[Binary archive content omitted: a POSIX (ustar) tar archive containing the directories `var/home/core/zuul-output/` and `var/home/core/zuul-output/logs/`, plus one gzip-compressed member, `var/home/core/zuul-output/logs/kubelet.log.gz` — a kubelet log captured as a Zuul CI job artifact. The compressed payload is binary data and is not recoverable as text.]
љ!S?dL2C~!S?d2!S?tJ0/YREb &DPK0R1vAPG1 1Z\n%ӖkːX1FWuT ϑ?Œؽmr&mպYXuoi"ud\jR{.晨#udLԑ:2QG&D򀞔rj8  U`ha)1H-h3-mT|fɃ 5= Ҷ\  B;m g&A&%0XrVF57j !du8mxInv LpȬVYfnkr wo d3vm*A]K4PLgSO$\dh/G- A@Yy;Q]޻iʽv &oEQ bmo(B傈D-MoLA$ n4 4:S A7 ܻ7>~ w^fC'0ӇIo wVxrg Fs )E92(,}KԳ g}2y }]SpL Ån`%=IC릫!Us*|=ˣ½EMr8`6|Z0zG/;Fj0kBSL}_9%U9:E4KkڕֺUij+EOGnA.q\ndY`0, 1Z w?D.|m|.'6l|~)<ŷ?X}OSZKeлO!B*FqIΫNhed)KHzJyk\^ e]`DCG*F 6rݕ7񒷙f]"Pf@a1rET6 vHPk`k.|!gh՞:#,Ra4\g-nĝyQnZmݑ<ѲCD+tC`Τ.6FU SL^CWo]%yY%M,YR|9a ק37?_oքt+룶Z/cp x$ )[X1c0^ˈchnDtvl?kKB`D 39I`b 5 1tt[szimu`HjXRD< aR ˽)qmug٭fRL.wөt&'Nf2M`RXIaaؽ~q>́t 5K;7P.eurUZ5JZ{`xƣ J"J|2J5_">8@1SЉ‡\J;fLUP^ J#bAD)N:p]Ԧ2So<)FSF*DEC+֩ZcA!@iY!DT}=yauoJ5Ho |[B3_"(Nˀ1)."s^>q>J*Cؑm!wXԓ!IwP>lf@""cdH`0 C3xXj:ܱ=ܱ}wl#҆0}c̙\ jS5IzBuOj1PuvQ >K˚P%ufԙ"Ԋ#~Gǟ/N.{;̖$t] z)e^b ܬ>tJm|lyL5*tCCcպ<&%s9AuNu'S9h d Xh Ixbek L2&:n< v`nn>u"-1bis\q,_(Do(rXSh;DrTG 7☱!redQޘ}1WM[m_jZ=\Y@o}:=ȡۏ.,<9YnT9ni [ʜbzVl/f{(T"5!![|&`Oт~%yI Z_c w~:x3o@ votNiivY)=ۣga^{ԂN(U`$$*F1]'OzV¬%A:-^<ʧ7*KQlh!k-@R0K}O~ .H8^.)!hraJFOΩKMic~Sȃ5]q =3G ;!%v0_$cذ xt5-TăVzk\ؔLc%tOm@(mbs%f6YYvS$=!VWv˜AY#^(FO:PIj(%8R#'H71 8'T.dԼYtEU?ABzy"騕(2pi)yQs$*YQFO<Ϯe@EE^lav06S$ toh6;MqXd%$&m&XȜ" xusOQ#q b,􈫛"x,v4Bf$=1~ 63sjJ6V/uhUDaIRuJ+KAl]AX%dUM^i2S#'HtXɲ0/ "<@{.HޠMY88&w5U$kS$o%!8RyzI?AB|:B+S>[dTzMݭ J߂ZTւ&Cp)z{HkElesUZXb,!5jHyļ UOlɘ8~.,+H"S1U(-JH6ˈ"ƛQ/*]~ Q8t}%x%XC[zA٪Gs2ʑF abe 旪p 8?;+vֲo|Vi(@,IipS8H:F2=EBZkdZ٦x+ # ڎ"759b>_DN J^#|x*zgLlC5- x ;x|Y"xcO%xQXO㘜DPPdO<͊H m+sy\1l϶"x'# J^0Vߪ8UhBS$ewJH'JNq+B˜s ) 7 AQLaj%S$6zAZvխ`!%kS1$t#pYRD&(&&7N< mȄ9(m^*VྒN A/y;imN_I, 6ShUO1'M$#S*C6AF~H9cP5@cޏ %Y 2MŚ U ti& R[N J¹SrTCaNӐ`O7tvzyR8*$b( OAcȐz*L$LTk@R͔ ?{Rje ?^ڬܥ <~v9  R;iRNc F ݋o-}|=+<.I\U?Y'uU \m*55`/NgaY jx9n.JB]5۝;)+N49UpGNkߘJ~ttȃ-̱}jjߚ:[wښb[g__mK߫ur[ϙr[,|pp邟RW_"Yc5uC.Pe[%|`%=ԋ=Da_xA³VY m񩷉6=HKrUVgw\zˇ}*^?CZA] /[zZaA**d!B%B],^Nygd"Cq*(~7RVOj 4O8|Su )gt {@(;a5A:$ӄQ5iUYɐT68m"PsU4lj'\._;UOoE玻.?MN蠢 AtVR[% \ ,&d0yLE.ӾaH߃1xE[,vXo1/cЯ\<΁Δ2N@mlmX'>lqΐ/Iqzin1ӵ:{$ Uut͇͹76=ߥkjI Zl5b;>}l[J݆ɫoPaxf[v<sheYەM9vvkޕ6,"\\";}6rUs&5$ݑdQ⾈ɥt1O/i+r$[۬fW5:]ui 
c7>'*Fi&ruPH`TMnpR9&elsI9ɧܲ 1e9w5 e }[ZinLBȈ6 &L1u6Z-<2xAuκ XD 0bA"i;v[7*7t"8w,a4M.w0sS=YZc Vvml3&D7S'ao)ii׷;ur2Mq lyRjoنYߕڢVRBE'C┡^{ǂAV R1_2geRb i{m!/l yg6y[6g7Mv?v2x4Ɠ`<JC31βERy NBP0Cʓ$;2WDLځĉJ , wZPpi <źsCn/YdKd B!Bx֖/+*>ƌHHNM򠀥 $eTG"@q(Tc[6***ΉN# x=YS7v[ʎ݄t7ZrVV4+怬stFriDsٙl2r=K+ͭDRjZƒr)[xj!k'M{|M~c'$ze8AaZoÇ/94 ?_0[_|&Kp7n&̦|y AJu!&N & &Sn:,h5T"=-)}u:Ov}Xw}s-Yg{Ю#[̂-Iܜd -8x7W|#_k22-3pǴu5Hw .S Ȭ0W"-2*(. 5]^vL[TPUdWO\jisāy-a2AxZ[r+Kmm(aey]`y_ϕ;cq{D;Y8l01sx3ہgnv~ޓ0oĜ1qgط/;h7>h)Ku;k>,~y\P> IOۮ%(1Jty]2T)JbJ%Q8 (I5}Fqt+*F+M7o{D>")BdЕN1")joIj}DU\﬐5dlrhAG5VŽR%KwI2KgDWBrEsQ+}fzfS".{1넄K+_+h% A:hlɧ.Mң;׶tֳ R+QL.gkzaSצȋc'`/6To*oQ GȑF1&:p5rco5r`Hț"M4x.K#Ȃ SsxQDlbkF)W.x뜡mY"hrTɃ5G 5Xѳz`Yق-RkQ*mP7,֝&5+݄91­wR2Y n@.,Θ+.\$seUD5W\ ‚+X^dߺRjV'4W=|;'ӯnl JP>YD<΢?Xp@߮Uk ta2u8Ə']Oy A& >_xݒ;kQ`ɺdqvfi,&~B_V]Z@bR*+DWUvܨ_Xէ1WivQ8jP`.Oլ+ z*Kg4WT2W(0c3* 1WYZ#n\|a󽚫+qW/*%gpresup,U:8)-s%0WwZrfW@?_ݿ?ǿUm0V:w/Dʐߝ6iՄ$% b ~5өTovb ˝1NK1.3:a1L)SX+ Zz04PRE6&I1XFNB S("S6j/ՄL0%\έq@FK]"rPN_)",Ac4gFIC%#Q(fUkDcoĽ,~ْ_IS`d9-ZcÌ$O?FANu)}.Q* Wt5"ҀG4hy8)-Z%iTh1sQoE‡ |`Z"-PM&) #iU鿎6Ѷr}KɈ&+Fӫq^)\~gmKץǿjbc9f}SDYE$&H$&&u&{o Hmd^ "mR~j߬CRm;Xұ 6uXySgoa?#iG٥6jlpɃ0Cx"hai)i5竗89%S-b[)+N";Ww%PF ,LFKWddiR2Z3Z>aF Hr{v2O.%zgDk3}1boߞr&e^5px}jXsEdi lZݵC̘Kز(y}:co*j|ѧdAsCd0^U=c$it!I,/ۦMjBHΑ(S'zPbƠSbZO3$4R-c1q[,*f¶W[xV[n۶nرk2m ͓?~O>>5'i pglr *a*10Ȩd1ta]L a!]L;kՆj+-az ^Q Hdi/ӡ %x7*kKQ-6wzIbblcFSB NM򠀥 $eTǰVQ9ƶlTUUiF@Q |'6:xWJͭNn%#/6*tHSs)Qs>i{;o;!%) }4OM [#g}4|lb~j&_I8MLT'&0ǃ@$SB 9dHfEy:鉶6S,;i4aOvQi2(c)}SGYg{Ю}n0E,زJ+@Vْw-#_k22-L3pǴUOHw .S Ȭ0W"-2*(. 5]^vL[TPUdWO\jisāy-a2AxZ[r+Kmm[dCsC:|՞fiWjEhqF8cbƨ_,B>wHԺ=\sO:WQ{ >xfԻyϖ jGW=IC10OY)㙚Ybdv}}#֝TiYg!Jʾ˾7vZ8)6 pyEL\c;FN1oKFJ4CIL" 8a%+DY}h>Nng1zᒲmusH!ޯ^T+3.:ᕑ GHڎ9$I]˺C}yf}CUsiU>I2R^Id鱸)⣱-p9Z3-vYGN;n(O.*E\u.1a=36Iē\x=qh})J* Z1h^bp<֜ߓ{"eKu߀*=NŁ^Ϯ1$!j_~/;+dp(p1TGlgIYh)_{:Ij00JH}s8$mtr+f;=19 # F fTar͕[|r9m.8ڰG:lEBSAL&h@"bJerhZν$ZRZD#XKf ag׫! 
&ѷ`/`(3'2 #ƘCHdlH(}<\UCM~Vߺ*u #8kքx]%8߼3&H e0"Nz;RTmunof)[m5 OdjZ \Һ?l|,c4 -cDkA< tI^K{sRE=epGZ'l NjIS<XG˱vbrJk{CiJVcGvУ>mOhUy`Ɍo^^^DjrX2"^T<.2XT\kf8`oH`Y#!Dc[V[z $N@Y`-8 t2W.mEŀvE,OdY.oY6!ꓻ,GB0>j䈷`9H 1qW-j9V\-G1،;)Dg2 M4ؐ#L3iŢH^ru(Ej18N :gH~-MDJ%@ @!F8qg1rzZm%7`5/Rl[ƪ3Vvtj]@^2uf ԝNϮ )cIzrkh]0SSA;gIι]v>qle˞ZiwfvJ?b϶[F i9C}SU7_~~VmH6=>{~ O6ω0`/NY 0|2e5(j,X ,ĝXQЉNY }68XpOٰ/i^ѫ6`7_3*N~hNRjg#:h2` }Ŕc? ^JA޺ZSNMF(,ӱ a{-RL+ O{lΠDEUKui9i8O nY[>4}u?cAzZՂ܉(~ԫ%l\eG}5 rDռoܪ «'ٝx:}v]͗=Gkl9+J5?bo?>+pv9Ch#dlg?\ ~Zx,??W?G;x)׶~o1B-)˽߯'f+NW35EvuG9 LL5 h+]GCP~Y^" 5ݩDk+gBKꣁ:yЛ|{;߇6b]_0l)./˩ߕW!ioQTn0x-q6Aߵ{^gRy{#l˛  ]rD,'`jEa ٙSFف'M>O}DBH.9瞣~J ټs0"{%8`^X 5[[%UrC عع}V6}f4|]XUZoPNt(H'g <셛y , &XX ENr07*FhR/Rk>ry1cGVÏ f7r)Rє\Z͈3TÌهHNxBQAӣJf>ٷ,U^9pq@E:p;viODi;cEG'BG<^PLn.~JknP[~$0|qunn]' +^<#ӯv,K׮O?qʤՂ3x؂=ʈx))jVgȾuJY&9w Yއ,vEJT>Rb呔O3)ݮ|v:yzސڴOT^Qh&Xʋ9 *~)GZ-(4Sid= }GB.r erMr詊#$<~}lXY̎|[Yrގn >.n::o 4 L,Ig~gˤa#\Xce`v7ZjFy⥵`~) VV`ʖX8rʽLLZ::rl[ )j9Z:Nsn.M 8Ngrdj>LZmz߄k+kbj#"5HkbX&JcLM,RTM,R"5HM,RTtԢ##n/-|pxx9  6$I5բ 1k%D'BjIfR<X W9RN 2'^*.Be2j$F%Pbm¦0r,FNϢSg}N/-O -Az|@Ϸk(_}2b%]6{\kf8`oH`4Y#!|%62nu4[m أ%8:EQ%Tc[[qv~.j}%OjYƲTDZޕ^HTG,G)"6c 踫cxlFI"3h Gl L$fӊE%(-Eqy"ь|5ȏ'o3$& "G%Y#T¸9=-Ƕ63c;!rñf~Zc;NfU=:\y'no}6P&6 =U]Zדy')m&r/iL:zgW@1[&hu3-Zs'wY}hfk_v_HT_ܪm~"\hEEeDE\LTT}RL%W5*FE)MZ%ڰvЯJAa,Aqse6ޜiY=a']Jñ|+JS@*qd wowi'kόn:IKvpq㾎u"{srXv|c!E8\qć o8P8R C"9- 0W$iA;,U2HG4ɿ Q 5'҉ڗ(EN7ƛiLUt;s[G1x[D˖b\Ž(3)wykW b93s(1,׌$@?;R4($NZs8ZZt`c)qbD2K6*--b;PY BVeQe;|rÞ׀3oEdGp6S\Gu$D-5Jo8 E*1 & [`̼bJlY0 be4M&I5A%0W% *|L+]Xb#[b8"C6/,yUIHeӂp0^j `Eb D%T{ |]{wՓ>[yJMZUYie)} 6Ьѧ[u_G Kq@dK`(S%vvQ~.rd4~F,7{4l6H6 ZϒA^rDu;hɈO ԬE m0|nݰI5;d #zGиb\˓kj J1*RG#Mi 2&wwGU2o{aOnrw{kX Zt(G /3AYW™1+c17,ʡ;ČU╙`ēX1}W UiQRauL]ԟBڶ<6z(as%Df[쐳"_'=vT-jq3KvDx*wz|J#k& `gƀБ1x O'O9?TIYza 5=4mfݿ?CFiPpI!g.k Cl`0ɍnصe|7|Omy:+gF#<jAٟǚS{rŰN]Ww"4Wh@Ѣ\o}+bER#-CQ$ToшP!K ZA^#+_ND%eIGqVpP$9'hFXa{ժ98ob J8oKIO@gG*Ӌ6>ik,KnIv{}-NIEʤH&#`|< 7?>r#1r R‹ )i2*ZgrQ= 1*2ogݙ螦͵qM/9w- 6[5Plkڬվ9}#2|g rt< ˦[/u%-@ LseCGx)<14\9u5t>owc#

7;.R ǛCs&wN/cd_sֻI@ןRpx}J̓Yg-ep,-:98ZDue^hox!kE}.V8OqY~Xޓ U^+:i\_~_j?ߜ'_xQgs?3s/|6? DgY=V/?fq_xa|7y$5%\u?ݯH?!q#>qTUVcW*N\\-_(P\\\`}ht&r5aWovеO\J%ĕݯh-eĵ9ϛ gesqڞ5sCdF%em,iwz~~z. )$Pgq$`!%b Z$jN "t_GvyI}eB:Oǵ $M(4"9*TpR #8 O uidL-i\jrM\5q&.嚸ߣ8ƣ7=ʨ\!_M\5q&.嚸\krM\˟&.WC&.嚸\krM\˟)ԞPc&XNc&WS{Dj%=~q(n)åh-F|LDcy!^.:{ƕmFq!>bK.е[zW)NyMhP1HL451sm R9(O'cDͫm,˟ZIs$h` DhyhY@,D@IP)sT i #F:*)Q$wAoGp7q*N)}blfx9iQI֫vnb}rGx'J:X➜zBm9]l3O%BT&DPE=Oɂ4`{IZӨC$YE%*#Qj EOkłAV R6*-Y2G)v{e!/, y;gõ iS?N>`:"Qi Q aB1J\;C(BB[l٢'e eɥsb& ʹxK-,Efƣd\.;+Ԇ*+]?_T&!N h!#2VQ x"^"pDjDG:*k blQ͂X$b^( KDQ%b%є(S<(`)eTG#so }aI+UN;zN;iS<߼>a:78.>?m_'q{eLhQa~,-wtsyn ďp_hpڀR(%fIXRǾ1~t{2;Ap" a3_^fYNZ6{q1j2YLfWll,KN3.QĒ1`)kLHڅhbfÁDVHu mm"X,:i>Á[E?L>zp۴MFXsS1.qy $*T9JY7MVAV?9wsQqڼz23ktiz(»M)Sr|ʙpfeʘc˺rh_]*S)C'^犩gKoJJ7&Z6fq`^7YHۖqA/tJcr+.*^ۮ̚#vWk6ׅVWt7:u4z8l401{ ;{]gvwLRzYn!hq2cn2o=qh|Bw_NS*-)LD tR{reyIvu|5'cB/ J.p uxQcGށ>5gwǒVlozeacK!޿  BMn"a4h_JafrgY!k؀-yLy+j}-z$>"dH 0*V! LE;WZx.l]>09+J晄$AG=mr}Ctmݟyz{եhJ?Nm=on|Ty )kXrK RVu Hk:!MT&T@"k%Z3+aP}tRtQVªjeU,1<1գJ+:D8`-No&לL ^Py-x3[2ܞ+Fh>UzT]WhX"mZD)GLwl= OV>7L4#i)QQ*29uIj"' B(rWف٢ʊ@$ 3zrOY1,8u^)杋jYȲd,U]*ueˑq[)iH1h3jW-j9VX-G ڌ8)D29"M,ؐ+]$n8׊G%@JK/(T'R|4ʏP%"G%[#T¸9[Z]GO`5o.mΙrf7Li.Gȵ+f[Rz#}o}qE<.X1r"VƵ]vipFoqwsnoؼ韱g7;[㲱KO M[sdk۔,o.Ga [d"jVTkVT͊YQ5+fEլ >S *i5͠4fP *:БXM3"5X4fP jAM3i5͠4jY۲L2vB2L5LR,SkȱW#k4wXwYQU]UjVT͊YQ5+fEլUjVTu%ngwߏr\n.mKH" JPMHY TK;Y9Sx8Dk//@\92o.DdvF.i`Ч/m^֮\u"mg=iےH$0})`P0l1n=,;꼵 8kV8f2̒,|S(*H\@D jR d%Τ Q娀[Z8b`C{gtFsv︪|ϗ- gVƬy8\+Ł2Ş1Ty^~zګɪtQoRam&pq=m yr |T)wLRn^%bVG.,d5EWtcY)«Щ# 2G .6 LP7p$uk*mpL>bJηI>lG{,TZR֙褢n:4 ./+m2PԴ8y 9N F}`4坩D\r?2&cԒMPSr()h+2Qj^4hglZqvwd^Zgkչ7mKpT.^ l:zYuz>~RA/CkB}lLyN;u#qU >(a0yY𧓞805#%Jd}Ȗ{+ׂX4IlDD'B0+M:e* ڸq.@@$1 L$xZĜS)OA[1r6U)tLI}P>xgO{U*JBA Πϳr~I5H@qFR>!B"q礣8+F)0%EITS-x!gZUUIv~ 8 ģp^)qQ})Sг-s#\"0fFycT02Qȼy0o>IigL}`U >k ́\T̾Aqaٴm旚yF <\kyY)?{WVJR$tܙWmErKN:Sbˋrl>đ:뫯ZXPDX" d6Ko\Xm3Ps1(1ymAnJމK2t)ĉJ C~CJ _pz{77lP1}ϟ+u f}{\Z^/ma;;CmP*QE;$p7J OGĕ gxs#YD`1tB '[r/q{&3AV)cs,g-?_z]f!6eE6ڟZwSYrt() mV9೗ D.1N L|HinaH)YTZ*8Q J"t rO[EtH&4O69 
fd<{.Im "BL9R<$5] 1J)yA1dS1xն5A[R*!zhSQBM$L>',|!jٍYIx)X:mpmCT=d~hy&.g7oAZ= "e;k7z["8U4D[|d&Th*dMe,}^0trW )1-Nfj%nqZs.64Fm%td!AImU)mR):PhmhtP ʖ&e!kN^#I#9zN*η bRDԍQwD ΍r[%\1VdA !M6ª1+* 5޷ŮöAowMv6h ;Rc_3A #xzW-ռy Y>ͿBɈ&I7W<+`b`3^5v&[8,kX#=7Q: j2d)qQ}KE KB腼"f$W2ŒcsrN߸Ivg:iY/2N1OD1y,t/4ŗqv[#xQb. +)1OgZė%Me_H55, Q!$te.(*d ^\2JbA¦,}soXvo -]FӳKh_3 =6ii!:Į 9&+q(IUZ-=Ji'$v#@}_o.G{W|q 6)q+(fI??Ln⌞/fzT{\ 2@[?nYǪu4km髮|i` 䍮v_[_>d5̒9EDuv~9n'a@vf8c8[kY~J `{;  \Uq \Bs*W3+Xᰫ*Q ʝ;\U)C:\rI+Xq;J1oV)ugWo_YpD@WUpZz:'zkۗgWpe;\LssGE.ʿ-.|o#UhP`~&GuH>D^ ͜^@G 4#,~~ᯙS Av٤\#a7?}~zI_O7.zJ_Տ'rmtbl⳽~0ܘT,F.X˙7`Y*Zz?1Zh (-= 蔍+Z'lJIY GQL=xy{3LHLkclmZ44.j VuZV=XgG=ў Gm0RR`Ld1@<"\JYbL9f!g*!;z ZFG/'k_r \ܹ]fce)!/=*4)u*: }QԶoӡ(Ή]! r?wbsF4G.:8G.S l%/횦$In!m4I "Q`kgDҒA:*4=0IA_l4lw>݃xx-6CLHoy S1Z }gmϵ-cJ;iX6<,4HXLB,:rm JA%c@aiAE )u )B:#=L>x!I/5vtgi𑑩h|^~[ ۑ\fk_o>/,mޱ/n|q6dG YΡ VQ1Ӆ֢@ &'EP!rjptBvWBYt] YB%{"Rh'*nCL$zz.scsyě("Ee,Yӌc'ERN)*lh"+ER!8Q?VNcPXp!R McL==e۲756`7g%K3K+5+rhI7U:v+~z+LbпX*|(N?ߞsHF#W}e}to<k#m:wϼϚ'Oc˛w-:`ŃŮzdbKVa%wc\i:=7R;XMU#eDW":-0a!['3ARb֢J$N1C Ric늇٭b:!S]^qQ1r⏲N ',|!jٍYIx)X: %|o61"JOOb#6H˵٢AZ{g#Ƴ[oภQDRǹhoMC|\ .B"2%£Yٍa2Nk幠vRԆƨ ;ݶ3Xπ!B׶v6rIV`6hc} L̰6r'1[šy42K#a] ⑜l}qٍs~ W\x)"ƈ;"vDܶ_̽)": g JbPɂ,gB !lUc$OZUP~BcDl&Q r^gUR\qt\x֐raA϶(w>I l`:Ia((νTݣwzTʴo <6^x97)_.>AܘTk .XK Ӏehy{ރi:!Ud4"8bEG˶EӛØ|cw2̸>g|&\/o5߯2lm=rbkՐys^] С셍<ýsT5,z,`Ņ7wy 76Wyz yUh%pE]/s5 tA>XRY)6Nm7|>fkc5ҦǪӦ)B<*8g(3H-\P4}!~MnUcڤ+`bb6z]nIAO;~4f~1ȫ͵,jEc )U- -bEEF8Q'.OdTq>-ٝJBoҫ܂G*$ѳTt7mb,:t.o2d>I@#FNgԋm_7iOO_wX' =i=:zu)PL#utzF8`odq">'kzs2J?C>}+28++r_UYil.cӋ3bUu5n,sRjᜱ+ #ZNmH)4[\9huo g'(o읞][Q|n;H]5+=S '#@QE^HV^=2O3pW3wȮ6+ 9fUo'v1&$?B*ǖ =7 ^Uer4NdNg I"6x+OW^?۟zKF|u}bY{5[wDO؏V[Cw0PavRKsMr'YM ={m1YV&a6*ad1Xv^5rFEKPHU>\UZv:Jӽ.^%N1gK4nJqoݕڃ[㦴vS :Fme-)>R֖v[!nLVg0)ۼI=F$BK#Gۊׄ,ؕr mqѹk^k)KNI8_` -&uwZeY=ZǬJ{w ۽FۜFiD9w F os7]4 4T{}2ឝ-ïDh#!Q 0 "{t>e|HEb5'=:fDyJFZr6 x}1>5qsIYU-wTM%9PJJ-Atr=ʜ%xn:r*ÔxXMt($S&mI!!QQ"BE1k#ZK|S+\.@5#JTm4Jt)*k_cbfC ꨱԳRȮPB@&тw_KͲnG L3RQ/3*1c5. 
Q[ݫ N6p Nk!O8v/uK D8V(4 I&jeWɕX^4\,݀8քFnu+yX$⒲NSzrzicv2 POil[(V-tC*D4@,-(̨l`W&4:|8+Q57!:i*dL'BtT,TB15V^X@>ಬT }Wh%Wk m :Zom1S;+EGܤ 7!wD `L ӌ/%RC';&|B(ƍ*9mL:v_@fUPЦ F@(ƍͤ5#-h~GB['&!Jew2J"BQu"F_R8q=ӉȼaẀB¿&伖( RDA Ք&:wS."2Hb0Է{2 {xWе?|p.#h8A0(griڍ3RTY7IQǘQiNR&D_ {a& 8/ bqΆeʴ?^VеYV컀 nCj!0pxK $,$\ፍ`Ӗ8St4$m%CRbxLECMZLj,Z茸pΠ=h5'PiD.h2QӪAUYk2!`qcMy H}p_V 5ȤuՅ[@q;*q /]g,:?)j`J4%;+Q8m&k)}Y+A";>n 7m԰V}wu4S*#+xm e,!zإ)G;~R{ЋI4 Q/ѡB/10 zv5t3M5"j,e4 M; D4/^q5*a* 41dkJqc=A;w'FlhWa9MOsu&j$P@0ufU:9 4IҢGJ$v;Nd2Q]h5(U/ y(Y=yp4x&f cN`ϟo9)͈=Xeg8ݐhA/Q[tCۚb xyBܢzҤD+Y.T`=@eˌ`*1J[Ƞ rwc-i5Ö T cW;lE]Q("Ndvi'7 h#gO7(V2^8iB)j z#XQԌbwH ;^ցNi,?{DmDVtxQ3z pNmS; 27 3hҢǚ&ELI'dݼLFLЁha3BBvt!Zg> vGÅMj-z;kxC-:m@ <dNStކ`ݤ1jEr7ǥ!H0r(:flҪ\ >$Ct.XzՒ-0 UX yۨO]ghvE#Bޙ"B_`j?ʯzBN|7;Xe@Pa@ 5* KKΖXC vVqi†!然jF\M{<̚Y'%G G s krO^ +M`%sTEinY@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X *v0N?J (`օM`PzG@c1a?X7t@';8jK; C;_b~܎?#/_V>5oܫ{KpclOfEkZj!7ڦŇBiϼ8X⧵ o4&7*mGh4}0E,]O ->,|W :KNofeqqV W|x_V!;Er6aJ}OGcG'4`!#d%-YozK[ޒd%-YozK[ޒd%-YozK[ޒd%-YozK[ޒd%-YozK[ޒd%-YozK[>[SFGzK,%5Xē?yT5+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@W .+k&TG dbvo@V?JJg^@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X |@-LJ7.Nh)ry}]\^PZ]D ^'= #\4.Zp V¥ \d轧i=+OkþYS+2HgWG/`nb_UYa tp*"sް+X4u}8\շWCY'dsnGfWwK8pu7+xZpWC/jͬkMA ^lg'VN?B8Cry0!$s萊>~ 4JDP9 JsuS4>㫪CҺ'k y~vCb |8)O7NX'%n}yn^#V #a_Ewyy;8u_t3 ‹yvbow\/e%O{7fZm=BZأr#Ys|b_jq~yP[X5HqT?b{^n_Xҟϯ F_mI {FoĹ_WkF?:zgX߯?rqkzG/^o»tvo-Z5[-3:JK\^| o8+<տ?ԫ{bE~Klp/$؄;.q:ŨGrtû?7O@ZF7 \LKzZI<Y=E&A <%#NNޕ6BӮ0c ΎyJQ$UEX$%%/u>H*""ȌB/|ZhKRaT YATL |lp^ YN3p a`Gܮlˋ d(dcj|RGO,2t@乜C#s]#D1*oG݅exYW/nb-YCMhDVr% t|12<-#-sau"4a( 'Ab!N: t|Qe>7q3u?z>}].^"{=o J5FGDS ]`skHd"tJ˴HV,^g.Nslr2|vhM\Diy/Mp]2?Üd O 7>?Ohko/x@,6^ۀ mO?Ðp!G81C%$|Sf:1|ח {BunWTem1Ww*g6KI`*r lYJjwF ќL_HtįnD6c=oSWB>6R'jmZaM>& K^zUHﭣ o{}SFA'M,VV߲;N9@̗ݳp@E܇}JkFq0?;pyO)Ik}&$1𢞱D۟8=vr_XL3 Ya_Ȫ/oc'Mh?ȎFϣ|=vfs3ǿ#!j!Q~Q(jU=73/زeC%Ð?(fQFdR܅EMP =UrJD.]6TJMa%<]L;yaͫ׮v ە9Ý ^B M;bn!^)hU:)ez!2$<aQ {I#^p j:*k bMP̨͎\j7nnF |5ʂt0ovk2 |:lZptJ=Y}q(].]-seVìy:|YW 
V5#P>1Wi'OzՔZ6n<l[e.zy 8˝{^=&xt@y~ڴ-"{N>VHM'+W;\65{ҥLgÇ6ϙ݅K\ Pwnޑ4x3&gsVʼUfqdnt}asWNY2Yg!JފV[bw|]@0(j^wgN ~vytNL\HdS̛ISr()n0c%3$}gDqq?=4bqNea9"{2ڟ(^2HT0_5/d=|+A328[Ƒђ!ӳ= 4|]X34 / $>oUzЅE , &2Vk9Em\p-bV,,𦔍gAʧ3pL&O4Ѩ(MEiՌ8C%1X{r OC(]P!}l Ϛ/<2Wqr'^%X.R;Aңڭ"ɱ#Aѝ70;temO `dctxx9  6$I.5Dؚ{#֞2a#F`h8"81`&"b}|kQ6`\' 8fح-NdFhݧĻ<3#Tk;蒣\Q-%ƒ tQp qpM XԂ!"AdsGB1ȸo5@ /JYLg};M&=Xym)EqE,OY*kCJ@2bʟM&Akhrm~=2B|(b82d vˣ6z; C4kl_T;σ8?ebE/$ޏ=x ~n| &w >zꇥ!sr}?WJM%eW]3une%B0>j#-RDF)aP¹.)X;:Dg2 ^[i!S$fӊE%(-EQDsH8a.x!kirT",Ax Z`'p@RL=V`9xwv>n9rfo]쓐[Ek)thj3B6v@ )K2|5j.H%eh 96/tq :WܮyU~˳G;C!ݭk|Ǎi|+O#]|奻P/7R-VuN}͉mRUvөDl. Ǒ飀#I~(`j,\^LeW^Le֐s)ֵ;3*N-;|]VJSwjBNYTU,*EU͢YT6 aYT,U,*Eel͢YTtDG #VR){x?n2n2@j_3J-<{?f|ϋEλoy(Ik:AqW?ZU}RƂ\%ˋA ˽j!gE8o'=vccȣ8fǴK$V6ىPI%c!s(qiTt A(czfl 8 rIē\x=qh})J*-VLkhϽ9-->Gsu}sʄ[r'*< HDRȃAr(z}A1 "qHc!ǨE`hD TD%eIGqVpP<9'hFXa{8?ժ("'&}\ń =kj J1*[R# 3@HKbA!rг=~@gj\<蜟mo$@fNoej|RGO,2t@䔚 3GcT6R/H5"ۡ7=MuYW/ٱ)V5VD .OFq4|!.#-sau" O{lΠDuS·>~Xz\./'*atL`RcX36`Ă qъPԸOO?'}p_c/O:ϣ *cPDPO N74VndAɁlnH8vOIPb |ex?L{c?H+EA1 xu%/Ycفemh#o;j6Z4G*cI6% m,FgzVW>2Z}veY%Yt챪.tKH}[mn9[N7h/^Pp<\Ƭ$kiFȢgFK p$9[vY&ce(6 F"i '>rv(LkN)/M^J&gMz>'V>+VyanZe>-eM 7wfgq+UCrz}R4D}ѧdf$xGRaIdYYU !9:$F(BN[$YB TiXLݞq;J9/,o/d,iOX{rO׹HI7~{h8_|pͩbҌ:% 7VYv [`̼b˖ix^s6T}N *A*QnQc^ejdVsŴ^ٻFvWsDZXDd&/FGYH,.ɢCKjPDRQ2.FmrmaRHiG R!EVQ`'!N !#3^Q!քHdi/8"F 5TģS-]r8p/"30DZ( #XKpjk-iW\d"? 
2+ZM^ED)!RQ&IEyPReTGOhCaD,&i@oC\,鬳0.ʊ4^yP.D3LB%0sX}8FtH,R x \ ӎ:Oa+4_c+EW.-`8%pi w%%\+WS!6 Z*fi{,RZ!T#ߧ|97JքF 52grXhn󃟼L?6>!$HLσtJ .׵jM9xXg1z3y|x~1%oVA>my&ܞO&(AsMb5Ihx0#%4Roj13tw $|)CR6G_-$% bIM w  ٜh0@~ Fwwv:݋ChPc > ]p7cao_r$u{r7ݯNBYuY^0kN*mdHVi<GVg5%,vSvs:Mુ~v{߽|_&mGwy-%i/K{ _p.a3]-seVìy9\󬫅n V53Pw1mQ}wNJw uЖZ6?./dݶenh&#rkG6m\p}J5#DC+44qE'gHupvX~!Q,=;Nv̹:tWPKkbvEjsB[[wGNc~Θ<&s5$=쯎/ZlΦ8XwR: QTu,Zn4`x2Oâ&錷I/`\%υ1Jtad$DS0Ĕ!>J.p Qj^[:i5's_2ƵW,KzUM/dkBN3|z] /U(91w}<~e({}^Hɹ\g{/r&YfչpՎFx:E DK VH 2H%^Gow{!?t6Ž,%ݔte!l"lehNl4EjQwV26`94FKހ|d[QMǬة1:HfpJHb.jfIBur+pek ?΅:1 L$xHseuLIEx*}ZLDYA]#N|SGĤ%$ӣrELd9;jor|Tyd>qaӔ SkîdY\R ^,Kd_JQyEpj Մ\ \eiO ,%S \F'WWY`e^ \U W̞|#Vpu. WI+؅ä\r{W2  l򫁫,P},p% Y Dk6=_V9i M/{6\E;a6PA #sp(& )J9jrO#TZn]nhNdi\Pƒ&qHk-C0RZR.uфS&CD 1Ṥ>}h-FT<$jw\8m~|H)$." o516&gh<ûS#6DJ7`\ۛStH:fzp]A=!&j)hT/ s?ؤe0Q"Nշe>$%R6%Ƣǥ.~}fś(,qx3q!2ܬ~ypm~~ok8nI̲ߦ(N;,x-x|x}0*UBmKІvuA]AȠRfnKOߣMS%]tu)ìszYV ҪjM*ZQ~ GmJ+2+q2J5=߮\p7-Bcnh\ԶqhrprΞ++&'f՝QQ*29uIj&gAD*u³泝\X= +'Ig[-\l.، :-z&{qq$Uͱ˰#FR\?YC[e$l| P(]98GDh  6$IPj%Z #F=€Fj|EHI0IL* j}Bp> O([ dg/, o? q]i;;5̻df iV>oSq<{2û,g΋c\Ɉ NhmRĢ  &kd4Jc,o5Rqrb$ /EJ\Lg};FӴ{{j2(dŧ~E,/Y2\Q $eql,}϶RJS-`6G^ '( tQQ7zۀ"(3ƢWAW=AD t"M4x.Kcr ZhTiuUɣeG]#~T-뜡kY"9*QiOrk1YM.&Ξ]sl \,lv|kw3ymH+::Kk36cvt6g/u쥺&dhYfmwKG=b?Kwq5n ݯ|u.O۵f~\b楑ϪϷxwwVydۻ~-YpQnzx9:NEPPIx{53jOV*S\}zXKAlߓV>sP!~o:n3߬ߗ:\-s=vJ.D8'W-* ~%+ٵ$ei5{BP@M A2!}D>\{y#yBwyBGyByBmSJ m>=}Z?ߏ4$h` DhyhY@LJÁ:hg]ekQ#h(x#84ŸZNѾtR6x7ϡnϭI΢'+gdIa׃GyOϳ-w=U,wxUE}b24Uԃ1,hNq;1FrȐ$IMuZs$? 'zPH 6X[J؄ UqbXXL3B^ y“bJ {f[ro:5m͋?`c0~rtT3GCBXAFP* uP-0p/زy@'eΞ pg6T}NBPзC^dT2WtP)qv# Ò.Ǣ6Fm]l #ug6x)$Pٴ#Å )S"QuR0ezR@/kB$^ȴHQ*ѩʖi[Lx8s!XDQTDإD@4+s+(7DER@xeW.4F/H.FSBNM򠀥3.8!(Q{4!*ˆXL=,#j8YKYg1-9Ua\+.vi MY͜3 \domDDXI"u'Kу )p/xXL;8<<)y?Um }`mE ꎌ^K_3Xg^=z$mEAiJV$A5! 
fIXRh#]#@^6Y:'bsJ"FWIh%ՕLTDMW?.~ߠ3?5(I^vQO:kWIXIVPaHET\yiEYz BlF~`le# ?}ӻ:Dh) z_i8NF_zzm} N~vzi0煩nCTZal+wֻ#hS,,dp1z2gW8Q/3޶F׫YCs@eaIy˕7G0<ٶWW%.T+X^NhorLy<"o^OzC[O~itݧBBy/ѾM]^p)*.~A}O~W1ބ:x86ofdg&Qxc߻9~(j4 {)JwDgOF`a0n((b;bf- h+>2-tDwCe2Ifp $WT1BdS@@ί\'lr8빱1:1IPDBgSGF☘s(cJ*Siߩbl'BR=\2wp?b\ҏ${1 .\.b> \٨O{;me@_Gy?fa[P/q1y ~;C®ǫ~b!+0{+|I+aN' ԋZ2I8JC>=3sEq<ϚQ^zJz ?ȫ?"b?|JCE >lkͩZoOr{iYMQzT~6}C}F2{lfO{J>0=y4ta/{ԋ5mӓޗ\͹Sɿ}ȿ61r .\pjꝫ(ךY ՚wzv1vH.dt}^ڼsK^iJ݌;q'2TsW kdelHUaSLUyC!I9͋v7|]TǤM 6Q*I쑴i7(`CRztjܻj(pASK{{k;{kLVٜ@Dce(0PUwԦ.A3"{w\M\V6xwP.(I|ڼzXDuyuff;C$N-TQAk m1y.]K ۼ-QV( ?,.Dzr[YJ*f+\fz{c;nyNDh.!*ZE?xNw1yS%PlaCt_>c7~q}Mfhzr8T'gљ}F(ٍlA#Ŋ6,8{k:Hk ? 6NrJx(Df(WÏq/-8ppA#%\֚=Ge.%AO"gQr4:c{tZ#(ah8wWAVCYJqȹIU\eqh*KءUr!@W/ӎ& 8>jW(aϮ :zpe-ÕN,.c,>xB))|џ \3B }.wB@r+xڽ{ *pu_),U[j!O2/d|?Wgɹ.ĭLjw#Ϧ}ԫ>2$EDoPTU %GfN-&-'KYLK*wҩs!JaNb[ "0N6@Dڽ3*Vv!ԅ9$~Ux:t9#\rڶ%ࢢFTqmQOCQEJJ8 N9v>&Z9| yv3Ͳһq^Ba ;u!։Ozoc'(߽Wk7c`i&scU7NE!|&ϯh͸ebmbm҅8[fqv?e֥6#n$;"),_iUcUͭƫ @T׉s B%4ʂ Hx%z]t.qPG"ўq0QDD,?5 &iTCh+nּ|ejz[y.9&Ӓk]<H8A!c@֡5ۤE5 akd4Jc, o5ɡqt c%T[X]Wg9RIZW|Cp56Emo+:iz([t-eY~ǧreˑr[)4Xsjr'\g9vcg9h=Aț"M4x.K#Ȃ Dsx1KQQwG e?bpruZBJ'd I!T¼8--8Ծc\Tb`GٸzfVێlV׺Es5$'†Q:3A.[^O:vYk!761'Yf"\{=>A1O͆Ta Kϥ;-ld./ެ+Ē}N7W>=#iQB!=ݼs6s-~I=oxjoXmuCvԜ5Vt>Yc_Tevb7P [g?m.(5 |F?mD;,o8^ ;GƭY繸2qy.=Q'TFt?@;sksݤ 2VJU:[9)@hSq}¦N~Rwo &ƁqѨ(KEi՜:$5Xț":,[FqX[xm۹7fWMw'`Ǖq^bΰ۝sQtq$i dرbGN@A ŕp,zR.2B' L+%sKkO?~wl@p&mhy8))-rOkѨd97΂QJ!TS <rC<~ˆGm-jQ%ADr &((-DfrD BZ rn!&"ga >q9|rG-م98rXvߺSM1{<ޝ-[GN"Rc&Hjɨha$1*)odeښzt_,ڏYbK2+m41HwXKȽ 6i&;CY9;>Y&SP~9`AI <a ܙ BDN 9QA CDhS蜂D7;㞁{kw~{1j%Mm۵X;DT+I_ö Lq;Jv"HuTXR%Ц9> v9 ng`4\{GHr u9 1\RBw;5}H;Ͽ?khLQQ(F3` cQ9iF"0ɣ5&7A;,S)2D HGH@OGPIT:N)}iRlWnT!-!ϵ*QɒMVޛNVW׵}>"$?#*9BPE!Fd.N?u3HҚx<$IeHR9wF)C^{ )cƠSV R6!HFblGr\b!/¥=9n522S՝x[0EҜ9FIDP* uP,-a^"mEkvC6l`;8˰ɥ9)b&ĄAF%x #v1q#‚y(]lvڢ0j;iƃ: g69A#jG" RSАU: ̋nz i-ddTyT5!/YdK$QZ(hTGeK.&vd|r|VCq;qqm+3`rΈ*{LJ"b0E[ b>ZiŔPP' w]C|>u_UFmApc3E?u~ C "Q"8 BN_hУݹG/šퟵ&_99Ni]~?dQ*vBoU[mMŷ05͗ g=:H1qYddi. yUL?΅{A;z׺pfZ_&1"O5. 
E;`[%0XjC@BRȨy aFkC(ʥUR`I-J2Ĕst҈h9QMB~F-3x"6ȫE[$Zmbe{f}^oto^'+U>Hup Z{yyљ)SU- p\&aye)ohM8V.ӗ ^#MkWS/3ڿ޾\|kۘ >hQbPYp`Ě5!Fe@U5T|N< O6;y:PLvQo*Tn!JިUZSTu ` aΧgǻr}c._2ۯ{ݰڵOoFGեE`TnٗMZwt2걺 .mнեTb+4|HPzf@v  USm0xJ}U8c`.ug??_͹XĢCJM!i֩hcց: TU3kUǺ"Uy];\>~,-^dYmKiukAuzw3QyZG''ǧMMe^dv " @* T*Ac }|@7 # Nx]BA9WvXGJ%t5:\ѹklicl#qQչc}\6*cTMJ( zT>&& Ro|/:$V;7}PVyN} -N!;,]ɏ%=j ` IDžR6INͤ|Up{s=P'tUDgey=< fstkYn)ɴAEEHαekZ pV,tԆy[.-DoS.YB̐(PH-6.6>-DɡsN|gh>(i\dZeGg'7ZX?;'! }I7|,z3|/O1!Sk+ǒ؍AMթM5 !iAQ@*#DCCk:{RdٚAbw!jL-i۔ >W 9;yc]j,[Y+UtJV CV#dySL٪<.gYYWښH4kj@[YkRBN|Y 16̨H}!>kƹ{O-S>8J_^x\イ Ԕ? Mej1h~kyǚG',Hr%[ cof~~=Uѿ/g|pØly(W:1МSh2իw6?W?8fIfV-o.W?,*0xZ~W^du7xϙK*X9?\{Uef.u_>a໳\%gq|_':;\έs{Q$Km׳EśC]Lg߂RFq$pa0f0~0WOgG lv|=1'&^oF~Mn|\:o}^HIn]A֯k_t|8ng~7+ssv]7W%,#9iVWdN>,CW'6td^V,_q2[>>ȓCQǟ~O?|黷?~x?&}7o; ')gٮ^mኪ/ >" &gktj8TáE 'k2X}:o7B2Ÿ9 UC>۪i?lt֟ǣ?u,B%a! Iڑ1JN JE#GK{GH8,T]Y'%ŌpTc`Ǭ4jQEyZBBԭnkAEw|[}}q1K[8ocO>!*% <@JnAZSܕG)|du_!峉Ve2:oQIPԾUjIvs6%1fj⌝yˍ!.?"yO3|R-oրe}y:)3QLO1?MA6HE:e#FTʛbO!DRC}n :_(ܽDqXg YJt)ATA[K]1S}\KW։]5Atd. M%SJQ6+%bնM5Q"M:sOV%޿$աbB P0Āl9DL+@5+}3%$JwUTѴ/Ejj"d( lb%W vnEw'BK?ɌN·|t|+G ɾ'/N7]/Kc\_.5yH*qkwXH;_/ ?}/ֻA:C.̹V`uI6KN~r6ƨʹ y^U}ZJME9& j#ùeBTjB.^{KR8w#c? 
b팅v'uQ1 ϗety N~N|/`P9R5IQD)vh MQii%ԭJ셶-Ghmù{Tj+4شe䡠v«(Ue :zs7bYbb jw[Em0`x}`"XBcTjjh/µW5ɠau)4,PX2 V6!{!&)E*Ũf$ꌇĹFR]An㡈:#qDm+^Tt ">֌6srF[  ֫uA'’Цܷa Z) 1;:VSΤr5Z9%m( drn܍|Ipq%Kn䡸qя8209gD &Xe1؍1qt,c2ŔPP' w]g{@bW?Pe"Apc3E?UQ!D~ʍ !D0PQz!Dh !BX=Xzן/ "QJuT!H1/4F\yDF\f]Ϻ4cҽ2ڕHXH'gCZVx(YP}E&$O9sW'll`\VT--_`YD@TU&=cOA6_.E?X}I\5XVZ۞:|-;$I6ok/nϩyo9$ /.T_Ơ-攘1BH U19"AI2ag"Dmj`J B>1g=:H1loPϥrUEb`Q[$pS 4IsBѩnG 1e} χ";noH}Sxl-t ^.صQ V#`C@>RȨxa#K+ 23@Z0` 2NflqcuJ\Q¦l+Uw󔚤%`.V^ի:iy^0Ƽ[[gz\sCDr.e><Orh  Ǜ#z`A%8/ 1d)&l `fF;I_NEj0g7mD $RԶf׳ dK| ON0 xCw/Ƶ oFXQ WOl) 'Mk#0KgO ic,DjsfrlW'Э,m}s5`5qa W;^ VHT/sŚ<쮦p;}㬠Nc5SķAsLqE5LTu+_QЄpY]?=S [U}^HWޜo,md.JY&s#6@b]\*cnc-+U[&xRTki) 4gF`D),e2RZEC*yRpc*h%{-Hø)a@0 ^ضlcR<N\}"Ͻ^t B j zĒxum8qmV.Yȸ a*H:_t <pw;NsM%rܤP#9Q [cI"|!tG]󿹘[qm z{vgV´#I9z)ۣ||gy,`G}D}(,tʛGGa[ A!rFA*gRbrP߂#A!At_$ѽ $Z+,vLpMOˀ +8kn?X9l+簀dvpLg`VLЦ<3*k6l%7)| bq,`ꉄ-:T2,iqw[0wM]%M|Yj0X,j3v́&9+fD1쇉fSGLjG▱-(:<"3tny$8Q^1O+Io$jMt<Sd_ s9,}_ YV} l ҒJ,[44r}ܦ`e` dc M#iS_ ̢yTb1Qfk&b;Vul3޷c:sjaxus V nM[Z3`$`-'IwN0b1pLQƭ\*#E $0%?& N@scW$c%@XfĩH 9V+1âҌ;PK10#. Xp=u>*Abf:Гu#[+@v RA7&HUH0L@Aq?phҴtC=3G/P0i0&'K` tB+%v剴{,Ք6(cw̥K"So9VQ\m*M8=RY?dC;%?k>wNJpfz}_mI>6vҎeg/e(d5:%J5.mڃ)W_g\&oʽϗ@eKd*R8Lv(is 0E_JAa^sSϋ1y~7 ȊN[lB\:XBZCEy>H( I|ŧi84>dܩ{s[SIJв&H[(q%*9w"'?V5AdZQ֊tbt1RaSP:uV%Oیg]kCۻU<9{D^-1_{]^UmKm6` ~`ڸ0xj*ƒbHs1&aQЂJ>' ]ƫ*'եN*Tjc]lZjd9 G.e}W~,툿8Vf*4Ӓql)tہWdIs?MRTwB$7G2h覊+2WgT{UޠzLNj~u/>~¿@KgtM"MmEYP _ڼhZPޤhEsfߠܹyBRBn| ! 8vDi_tOBhak&vv(M3^EIHeg7s޼+;=}2QJIޝ O>]pt,\u_`Y+xB}Hvb ,)fvwa L R2j9EpAۻ+]VHgv14ԁim0%»fht9my,L6̀c 56zT~]tgUo7f&5yUVhPT CVxFM\9`4fMEM^{i#^`~^ 0mʻgR$iI"z -K3rf(˵4g20ݓՙ]ny>>qP>! 
bpDSq mP ɓ%j- Z;"@eN4+bqyZr}9^믟;m8īuc3ӈ ߔbލ{ʂ, ~3t< :.^5B94js 4^=s9bꬪ:ة48ɨ9Gz}\/4}"WkN" ,ϫdy&+{WM'ZyJ{ݖrޝi/9v3|%gS4a?gJEμ4Z2^VY(ZPȵ1O.^L0&;EyXHOہu P0vY4a#A29Hy O^sXe;hv/A#vŶS&P+=3F iG^jbqhիU2}< KxTRQQ*DEZ~seKc`,\RTcb,(3x.[V0bq.6ac0Ng:hM'A $aQ[MTn5,[=98\[2ζe73JAS=4[t# 9,{ܴiђ.;:;3|nwCjF{n|0R@, YsLρps鍏KYԙEvfj 1 Y9R*"'\( gap{"wHv" #z!Uxd!R&R/5eDDL 6`IH[g[#k\~;~!V7Z.+MHc-J3vfF.BKu:lIUHHie9.RUCs5NӬг[R2Gٮj9ϼAs)W57yzSgVM&ZwY Rvs^p:4'՟$)bC8vfrݽBRU:@J۲' {-\BQi1A $s>:\?W @ L9qW -G*IiW҈\꓁+u9x\=)\BK|J`XⓁ$W B*I9o\Չް9LS$.'I> \=IJq~\z O x?p?_o5β< Z(jSwgCye1΂$ -q6IIEg)J^\O@`N&q>J?-LRj++8cW 0Gd*k*Iˎ]%)EžFSt%'WI\N s\SNN \ 2ٻ޶,W`?,BAwfNIQO[Y҈`")dʒbvвM.Oo:xtP՛+Fkz9ۖWóʰ`~#k򂫗0n 휏a^bWPVޟLqXtW/rYymm<Y,f!Ynu '1bePLjbKEq60g3zyPbN;QE?#ೡ+_uEtň:]JºHWvh3u6t y.tjrt(E]9]\ :~N9>vnp:.]V#R[! ;j[kW^ۢ;*$7HW2U,Uń:]%vtR9挜 ]% ]% PSnUQ-8>#Jݺ#ܳS8ۡ+P**t>S1D%:]J*TGWoVӵyc%P?2΁;cedwpj]AݶEcgmI gZr9Af%d։0NRr.cmB婏 &^X+gHe$ ]%'^JI:zt8gl*=.(U*L[+ͅB*풍U+ĹUBO~HsPmx 6 y,#gVkӦPq;j[U ʻU\ <*HWRsUu𽉙-d]]Lp?:eu ,^@%ߐ>J2&Θ(3GN #{6aGp"xXƧ xߦ@V{Q*d ~Jyy_ip+R\w OvX]ע++|<]y2K?O/}#έT,:I޻̮{b җOƃ,uqrX^by5ȧo"o*YFySa,]dpS?n`\ܧ0^nݱ ,..P0vY4a#A29HysQ b`~qgq"ݱ>yV+/סv h(!.Ի ʥ{i>_z߆F.xhjj/O_%ÍLkYZ#\:=qq/pzLY'%+wMtٺݤA_.2z?4u^5K .5x fP r?\je̦vq{Ycl[ ?.;Tq(ݨ/_6P/ `de;@> W I{=. C,hdqNV&.{(&e@_jh#Wٟ0ʓ0[z.Lg.{8җCkSdAͪ҇0 5,j VE;&BʤY^cym.lG;}SSHc:OA6d0fZjobkZjHZ]]!;ut$u5atIG7f(uSM~XE4MUUV߮c* x,,+GЈ)G(b͑RB V y$EEUH rH# be}0z)#"b1+ 1&"ݲrl vX(5.L#w2U Wve6ZhY9uY^[ѕoKڃ r:b6Z$Q*l4j% q˙ S( `14-Xhm 4wF (G`@ :Utȴ܃[YA?dz&{v_,&&mM'_U~0SH~QJgP*7[.lx, B Jf67yY~,x4\+\z{AmowYbjy>`œZym9iM%298M5Tcb-r3HԵ_fX E>ҲǠD+وmƘԙeZв]FCvur!>]D{/ dP SrG8k-LBT =vw!{!>H^}>~g=;jVΘ$0dQk!db9-4+k>~KI Q;7pvY$PNbXXH?h E90}:Uϕ,=V=io75a{fO@'ۓPas>a- Vx8BPT 9筂G:%ZA'8 ,&x#32x5sf"T؞Vi [}̅ƒrj6)jHɆCՍ_p8w,%T}P-H#! 
a({Nl126U42hOPh ׳yIb;U0 JD"rRحv<TX5ؗYˬ:&Z@fV{sy}4qv@LS.|܇Hl1*F.qVq BeDю \r jTAPil"߅tC"Sac_F-3"cMXc::IUZ0Sd@+vq'5r?!dS2V!!Ɓ3uS/gN D҄I gYˌxV[|tJsyɾ([EbNj<^:&-@X#8ICܘvG e'^EP ғ‡yǩ܏@a-w2w) ~:2lhԌi$Ha5(`dӮ׷'ֻN.K5}hC8$1 Nn`opdKz O*vi/h-FKך,ͻaR\Cvbkl/VtIQ(uU .(7a÷2 *O9h%R .2Z!Zr@)WGPIalŇ Qвq ]@k`-w?Ĵ D2J@%b!]C2VBOLp33uҢх>봁uȞEwFK&͚g-_5DpQs":Ɛ@zadk]Jzu*~*:㲙΢{Zk,Jwhn캤))te}J}Zp;ݖҾ)g,-64Lxe-ow}`K !UF4Pm<'0e|[YLV^Iy7s;Ns7h#3ΨZnn ukJ[NL"JS QBSƱZFG)Q 5"0@޵qce_(mn-- >mmɕ$b{8ei,K6ĉ5y!C~NXK.KF#__LR%8g,)N* 1%g((@'XI9/Wj(<}:C8ơg?bڏt'->,HSZكB?ۓx]KE- +d(똏k^ҥ<0L\ҥ`ĞăwCYU9)1ܨ,vKsJ-vQk'bqɒ&#yC+$LQ&5"8  Y:NWR>Pm?ˇW!S~M UGл4 WruZjк `jкb ^}&~p7ӗ x0<]Z!{ _y5(asDf[쐳"৓;n t?à4ۺg3`1-$M *8d <#r9I8e6NB81x(<Ş8b4DS*0K]Av&'a.Ff)(C3uZ Fj 6)97/|kkD/L|RGO,2t@|)5A#%IFELgyF);16_7M)XdwW >YBS;-1s 5<}+b4Sf$94ɻp'υAgBu`yNp" q rș}S ::D_gubm;{la.=u.x9 PRj,! ɩNtJ L$}F+B+PkLmL'_c5?>m2ZrwP}:j kqU-}ݻ|W߼~ w=Tm3qa[,E ֭w`Z1C{Bg`ۇ>7?3tl3?GKϻψK)@Sjo2ln@gz;zS+#[$Ee̶  9ȺL "ȳU< {B9H$Fwgy(NK0G#bl, &};!"TM!(W!jbJa '˺̀P 80`8sx1& .(}*q:ng⬙g6`s_7:\Β<F~4g̲M-ipc~$>-e&6Y"XN-B`"I $J=> $M*u%A { 40/oFvGNds&J PU*UI.<(XB>y(.S  b?6č} U?b]+R$)!1QENT)&+6*Kt΁ifpXcdؗ,tejkZ+e[#R8%X4D XSPF(s|KK8" Ď2WR iH!;f'`\֚1Cd_K a@qi?RbX!XgWcaTp!]I\REk.8jeLq9.q9/r4dWǫW#) wN\'1YMBDWh05>>Sv4Wx4%xu/ <%J8b=وtBybG3€i}?;ͳ(u_"blՏꓺx[]ZLF FI}&zf;1VZ3OJA뚤:0q{y3MtI!*LR5!woTIiW v]0Sk7+߻>E_pݻo~>O^x/] KeS~Jf5 Mft5U'F 렃.;~, р!67K_+me4e,51`-lSzIRp*N;##2ڋ!峕vAsO<<78~dޘ(bĚ$j"lBRcwb Kٜ(,)eXY}6I[VrӝNgn=pĕasS菚\h?;`.Dp(p#%C\E|&XQOMv1>~q:t,~.YGNZ,=5%L[#e&c S9G1WC 44Z/Ĺ1>+rqPkGta>,:IVM@gR2"#2YGx/(̦8Fp0b9@F"cCBܦ)`<_D}]!;!CFeZ1wAl"&$1AbX&$-un![ p.?./{眸.r-H뙐V{sH@L(,‡ 4KžR^BPBc,+|0*K,u Օ$Br.jګ7s/e;GNt|su} IaMRY IPY&y| m>ω˃&m0\W7~ѓ/;:]9%hntY@ɷvl h攥b`[IC21aUXQ3*>{Pp4]񫣪mMaތFG*ۣ t e@ٗʾf,5˯g_lIϖ2%L+Sʔ2%L+Sʔ2%e+Sʔ_ʔ2%L+Sʔ2%L+Sʔ2%L+Sʔ2%L+Sʔ2%L+Sʔ2%euIq(a,PK^W)yeJ^W)yeJ^12%L+Sʔ2%L+SHK3kgHkr* 89/S}A %) dkMrR@WvkW5ᨖ@T cb)&&T\```"1ØV,(9DiwG)Wq\9Cr4aP9*'rTv;;gM˱/j쏇5%R\oٸ] =3^mhuήot;+}0T]gȓm'Uu0jC:y+E Eckszt b_2keJϥQSzu0<Vf]!DPMfծ/j~ IhJkotμy4֋6g4_}2?-z;^m65ݟ]7Nievi.NlF;ԦVlgͅ>='w=Kd%K"~A 
˽j!gE8O'=v{}"wm_e^| _! Ea| m5Jv9,;>жIl8CC?#~lG8hpIͳfg˔ʁ+JN?L@V,(J)uTs2UYS:fAy6E\z>[]߷gOv&8>O#NRy?PA ¬@"e6^6A R:~BҠ O 0-d:WQkIX Ɂ[nZ9t,' k*A@,T2Ud@#(3 ܐT i8}s4}ߔ9 jw}CMpeٴnGMXr嚓NT0tTF9"sMP5Y("`fczFL9d*UF9Q2Z[Lq%HH23DY& qMvF4d\caVI7pSCQQ{,ϛ%r6uvA{Ud*L lqr!WVCOUrH&npS~zvh~8S#2cZ8Kg0γ&ːYi}% كr'"N(%)|,-ѥLMY1ŒTǜ8qB=}Hwvug92  %@BAQ^$R#p92uqvdIM"8\dQe+gOȉ<"#A` vZ/x2oȊ`C]OTOb}zr.ʻdX(TcڌN.*4(`Zc v5^ ur\iK䂠VRHs>HFj܏tjXX3BY 倅7B/.f=1uq4jw/~נxq#6Ôg`X)OhhGCe7ސԊT1&&׍2-C1|TԆ .LUY#J爙Eck"~i01qǾ Q4m K֕l}`FN*rV s*soZu`I3@Vthk 5)QYDdTDV#~k/*6'Y[k{?O7(#Xwtvrj+8=2Mach+`,y<RGypN3BVSg 6F-wLp7)灁FGmtp/Y4 ?3/c"DCevS:$=V1!l{vCߍeQe>sx' dk-I)3wBD! 2G!ֆK ̀l< O 'R )18Z!$JB$}bq )LJjA9NfՐJiHa@s_= 7f';%L)*"` >GAx%*F$15K̀aEAI =zS]CQ#ꓔږ8OપDVҗjxeR_[L k^ kB.0ޕ,oQqV a1s:lRI:0K -I@۪!0vI6{̉; ;oFrfbv~^|'/f~r~D] Mj?lQFTɥ7r :o8>wr'%xg3/CrD(ٗ! o#6~G(RDnjF\YeGOS( w5' 6ѬZ4bAر85 1/j 94cDNۼ~B;~ե[~aZ9I@N4?gVrޗi];" v[rtsel0Vzhݺ{:̭;Ze#ܲ5}׃sʂ7_$Ys[w[_N٣0 1[}mk5󓜇ݦ ؒ?ުa[<\J:-挝b#:Ҝt{fNͦAfbRa ehgv+>%`I Dz.:6],_1W'j@+x9`'Nݣ Q`8 4dDT[xs]>j^#RX+ )E:X' ˖q%AdQJaqLQhh5N5N/ Gy^8*\sn%ZXi`d sER Ahn:W80f\7{Vۺzc!!aȷ q4}D ,գ['@?/.W9 jT[!?ؠծqU2j+xLLÆj+b'-qM'JQHȜ's ^s&T %Ά٠rZKܖD4 GWEñ*yP)DfupJ0LZ{j{ˇdڻ󕪶1x uEێ[ɾ >dnWZo*0]M%fWF_U ǕBۛv4Wlv>zw/Fx @>yY77 !Je2{D.K -`]୷AJ:Pΐq庒ɯcҪ5!_TP %J+Y=ί|=lIPN=K|]Axy.fXl1fŲYxȶ&9 Ap  &"uE{ Pӳ) 9@4:PYù!@OPhq*-Cq);@F> IZ*񆬹rGXNDNb4=;ČsX弼 >+0~ޯ~e=zl/5n<0w'2&G˰qK@lb3.8:gG'FKx < N\|jrtL]{uz5-8m/w'o ᴽ\YO30-oq ëѹodJ߮~["7\v/?ja-9aNhUwMM `\6/i8-תTwfތniz5/׆k{\Kˋ6;QJ51gx'nm^ۍ"bϗȧ'Nt~ w$Pwt7F,INg-F v>ͦB//MN/`G]Mo޹:`=x(JF@sbfX\~Q>G|4\ :o;mq\#NۭoF?_~x9Mߍ.\;6}%m?Ko?MʨK`.t%h2UM(OIH^w?~_}߾[.W{/i/ ]!*g g݇jCxV欗 ørPa=pk\ڏFR !tj MxhӒ&?`_ `bVySlADs &y$Y\y3Fb3X x0a.M+>ÁGE<F'%DggQ!2˳ps2x 0\T9jDexOxcDID2wxqh/H\`NʶxC;BLׇ`󷩓[޻~16GWͨЬЫqǕ7r=Ūp8떴ί5JFXmϴ^i%X .\*YF}iwytz[՘k+(Gnb!ayp9kPԴYM|{+SU.0ҌyE)c:$1̀.$],2RɥAAUiUʢ0_$H>YA\fbHtVymЕyg5sT l~ q4~kt\Xs"&g\F#+T'y"!bo]B잪U'eRto-#w6+[f ڟ&3OȅJXJ7j米7s۷{5XvUZ󡈿VO¹yR =/iё3=lSsޛ\thV}~>kg thrhs͈+bN ^Mb%9Y+<(J}wŜ^#;2DRi|:wҢQ Wm S}g" _oV+R2Gtusr)[2R)pJҴ6֠ .ӊ`vu9пnb#oWT_A;&04@-鼾1:$X8P_d ]w7>AWNJїFR ~8s/ 
x]YQhT3FgkV?vOV lV ^YIڈ}N8[֩a*qWAXhX/l~m@z#WQ+_#lg\plY7꟔MuvRlNͱ?z/GcGr͗Q $kL9JN6^ "d#"Y{rԓ%G<^{w \Z(+B9:pd*Ge>,4U7e0&ey,2R(@(d 7V3g?9"LHїҍԃJ߂sif_nt!%.d$!qAj3 G% %]7"hOӊz==<`*~7Fhv詴"faZ)V/>jT P9WQ9*Ǜ!7mB @x:o*ꃨyCY.We1BthLGIz!&Z9'2"X5>FP։zƽo=iwwrDm}]CfOŋsmz/DbZ9Ȏ jM5Ը ,76hkA v4\{BwIމjx.XȥLLk9[Y1Ţ`gU;ct@)eO_zEA 'nV%ТPB 0V$ӼKND HiLj:)3V0=״ IeHxZ$FH%k7aT˜ 5O[i@rV'O{fRSU/#=O؉!/9 ĔBPl=m^G}CBBi7YdM1{ʉFN 02D.EIQs- \&ُJ5,632 a 3xýHrot}7oGl)A)< rhK%e3ΐ q*E-Fɠ9Qfn(bEJb) lJm=!aɤWe!e \˜ǣ&1O͎6TFmQgfkU6-FD1;ϔ)K;1R阳)GT;ÝJ릗6.xTH!3@*:dkbbA$0h"QFɸډW[Q9+x*Xm~<UeDT="k3#Gms0{$F?AcԫTH#cѦU"gL mHn6\F"F& #iBɦ98._$jO8[ ?j䩸h*q]30^)AK)fHɒn,U"("bDqRQx \<<6;NV1O˫x r_p4`&A11 ki:4}wpdkik9kwĖk 3ʹ0hpހ1Ĝ5&f%N1zw-OG[|U،4=c>*Zgb;=cei)>sӷ)Qk@PR}L.h3V "9cdd,:|Q͜}ҧޅ6үM锼ПZ}>~wd|Un ]/cϡ< 2f3Q!"S2Xe |NZsG BiC / E"F<`*~<2xkWY+JlWU\|=<;g ː~:w[TD/x]xHAJP񈚈%u$.*ق d%# OON!DghY0VPDKtH\h}VQT(%1qy O. ˼?ߊ$Oր셑կzf5ICFN&?IFf01[A(qL~W[wXԢc?jB(RT[]R0_e/ ݯhfpo()Q8DC5!\g˻,IqVa!szpY0NBeaM°xt]xzu!I1+gٕ& f7QDDnjFܙ٦~,SF5/Lkk֭Y1ܚSRX&!5p@-#3Ҩlk; 5tSYqUd-aһGm4fo2J:l;xg>\? ەۡ?Өx;Z`Iu$cko Q7jgTKӞ?ͧZ\s! `jP?sːd@賆%" Mg };Jd'R?B`EI܂5fI tz;GnkVBɫRŪ؄}{W7tvGٛV{3 L\ 1#@Lnm/'{iDQlRU0\ c$gT6'J6ƒEY{c6fClho=U&pHb.Rϼ<%BĎHUVq4Nn $r px6ȹQjErbFbD h:3q^OQ蘈%Z_iסVG[=e?bw>]Kƺ|끨+WBk\UpP}l-:p=s}9e{P^sC'f$i!' &w:f;{S/oy?PU6ٝ[Enx}ń:=&u%Z,'Nu{{MM/ƿd5!luC &Lrr*{q$eÿ| *B5⼟eb2T{VK3#P~ϡ*. 
[Binary data: gzip-compressed log archive (`var/home/core/zuul-output/logs/kubelet.log.gz` inside a tar archive). The compressed contents are not recoverable as text.]
#b3q|FՑqqb>XgQ/.Ƹh\pq׈w \4F*[U 0C`Gm\V˦j]PՀcfX퇇;^2ߕ7d?>OځlxzKXGOeGav:\vOE\QX CW.Q}ޗԆxLHaA$O+49I(&!} R`Lbg*"Զr8YG31uNҏyܽzA_j{/xlorzK#Յz'R ^l,;Ϊ\.@,dc pDcvu`x#f69 %yFCb{SM YF[ Aaf`i39p7wl/RYÜ-tz-}+vp+n\i8p^,EIk[6Mta *hS2zw"x!<o!t`0Wn~ 4V`|Wd3EWSXf4?J4XWF㔮Fs|;/=aK/Yn~^v:?mtADRwY^ͫoGxoB> Emӕ֑FYM/qܩ\1N2X׳r|U-/YR$FO:?Y[OڹdgXsWz#υ_M//<{ ^WK{-8SH<\CB\6ےX#q}Ֆ)RjoIì^bpp TU_@z{҉.i)#%zX^j.sm@$,akN g3^'KR(6~ҘvU~3|XQz7'9NYkIKUX RbUcB۪ĬzjE"#Nd˞@?J|2,qnHc@jv9.x>{.?gd~|t#$?`?NRU nh~q~y-V cnTcݪF3hog/BvQb]&V |=W'F.[ޜJ򧕝+lg}WIRk;\@ m&;Cw~QwNf+cW]pBCA/"Bj_#v#uJ  Dp]d"1D:yjb¿bo T{Ҟre( QRAɑɀV >GC , %.dҤ`J!4`rL[dQե5Q[+qc +wyn7O(:ߵn"1MfQ2vӈg \CYXR7Nt,f 4RNWyJPѕ!vCጻ>fnO&MKF>3-^E($3R*ĺ5J!9G5҆` QcC-[IsR"kp@GDlK62  uDy M\^:Ѣ*yxr7i]nEon!K?$mEND(3"TAHJ)j+]L1Fp8vEm맚Zf. @BH%;0h/eHY4;I9I.[:n&cWKYW,F+ bPI(EO s 'M reaǷ|< ӯ^i;?)3mv5J+AeMJʡLX\*<U:Zbz0uO.x_No >cRWmo{$nnd!0L9l=U]qB]<]XaYAq0:;0hs僣[aGlb9ԠjoknT.< [QT=hT.2#:uM1ࣕ")KmK K qxDZlߜ^4t>xaE|d(m  Y-ئ玬}[|WgaJXxF$J* eScbaGvH%y $~Dp^^|$6\ E`_Ґ&%g$e #ZHhkz k RIh:qA$"@EOE !q LVJ#aAi6 u)rY dجJA+>$IaJ$rTe,*O-s?מmq6;UC`Fᷓi4k矓# {'#G|!Ի SMS1~_J[Ue_b?PLs$năemwŎu9Fy8J :'j֫t9'+'ޕ$ٿR,n4Jz LχW0򔸦jl/OdX<"ECU;#_dF}Oտ@0YiC(XK\hF_CzSy:Z\\lXO{+$C>ٵ)ɤg_M+%ѫW3WɣJĕTt)tJEj a2m0y{ebR.&ϳ6MRgUtTWoޕ/܌/'΃ie̗^˪o Lⷛ$ȋvn?BM%1XKj*4#,C"3/T i'uGfezuHRF6V-ͲQ:Is> %G,S ; 1ʹ=78eg|ySFinmV#,\-C:6| W}M.P+؁4ULjb^gw} _?ӛD}z_o?qp]6Uan< ?T?߶/V4WMۢhl÷(wnrY*c1ۨݕ/E 0 R|l/\E4{s=IH6v؞mTK  G4'%a{p269֒*d8TLvG lhEʻ`(F+P"jAEƶm[f:_DbIYm'Cێmg; ^6aZG11x%# s0yh4s~Tq1/uv\qk8Ԙ݃iа5L ڑaNLrJP\gw1Uu#>`ENG78Ŝ9s˭#:F jyɖi΅c.u~ȖTQqtaY+ Q>[JIvԘu6Y`V+&4n9fidUeڍsìlK(S. fݭ]fe[?Vm{uAƬ|KKQcVաq77 i N6Ӕ]YwTjC5sD#NW14jؐakLyTkDDCvDZvQ a'G:ZO'`y08z%%Җ0 eQ(C(Ql$ Zˡa9$$V8煑de;B@wZ`&^%9[6k:m (a7BE`t\ "q棰.qY2+LK!^|("0cI+&I~&7l8ft l6,weڨ?fy1u F,76~??d#{ ;]:_3ȫu"?xR^'?Ne'ORMNNY\VW]=7g m\ΪW>6e\VN+r~hI7YngPʔF巯ɒvO5i6\;9a*e0Xy|J֦U脢QrVu 4ܝ,z{>o>٨wQ>f7r֤ͱn,dEsJEI}qfRSvuyi]ƠQ`8Aye/.:7 ^}H3 (x`6ަ983VD,`'Fy9G:z>&okCPN5GCQ9TNRx7eyRT7E<`\ P?X\*밊%> "2pP!(јiKc5o|:V}7['@3zbz{ [Xl;I$![cGƎp/e Y]"CeCue߆N. 
|wyojl(>1}YiVΘJ=W+,^)D.K0t5ߪqk[+6ڊq!-8O-Os^Z"}(tUafG?3հvi Dy s9WŸ ,E% &bR9c)NRnE2ARL:,,5g o>eh2Ng:hMANh˵ Z!I6Cj"pa<`axߪI Wg@z!43P̥C5~sbkXj#HNyԸ9vRѕq;Llg iě(T-#B@sD$7\zRr5u 2F H9Bfk` J!BdDظ(޺R]vˆ^FcD2Y+냉KM-a$ Xy$&D[FUgK5.|}!vavR. tU-b=QyF}Wm9e-a(AR6R50S:-g"xw,h>L)jD(14-Xhm 4wF ($@ ;A ,i l^+q X`;(1m)Jr^i,Xs.Cs>W+ZW.gт⍥ɑ9P1!נrkeS;(2̑(K7ΥF0X#E& EZyɰ- NCcrld]`B<8([X1c2bA ihVkC\3, “((TN7S""Ք*I"P*8[^Qn6Z!~BBG4mp8GX-n=tf;*wۛl4tve 'mNe5q7C' CJ\{sύ4ԁXŽ^ nZnGCg (gnm66~1 S`a#6+r(;Xbs!a*} f\jes%x_/!$úe%sI5 01-9U-!Ӭd p% HfP츲675b՜aܘ.ݏЊkiXrƦ7^/G;!s]|Ey:8\% qJ6ME֖Y-.ۺk6-TkY\?+XnE~\L o;!u:RmtfNgf!!e>g{ } >N+x >|jLU5n\ܸV̥F̋0&-]o"y쵲2aDD&p TQ͍QmQ'(<^-It>Ui2QYNy"F 2!5 5h)f DRx49#,T1kT T{YZFQŬi"EtzË(GLxo'`43:dx{SiQ2Ѡ;ZSBQQ E XZgeGɽqVJa1*0`RgG/6 $UH9^ywۨ6i:.cWb1 ccJQLDȸk) 0+H bb9rpՑ?J?ʘ^zX)v@1ʘ# fU Q/iLԊuZ,}:է]A"AbPι&%0HraUrj6Kp3"ا[L.YR冘,X4mZu6 mxtmJq6An2-J0K>=^_ȌopYJP5" SY ! 7Zc5A4 q-VƻIk ~>ɆNJY@ F+m<~Czo ѠjͦdUe,:u~?S<`s-:2NyQROg&=ߧ#VQR1qV{Z#Ҙ/{vb 8d33d60R5[^I{~,;,˴$ŗzYJ<a UhYy!׭f>q_A&I,֯I׾xaS F|_2ݶXLTOhm#tn܆}|lu4(f D}-ԉ{7.}!:ư{ERƨB$n(6&dMLH%V+vVhHeݫ}<ѽuEz[u#&h1hoG3;p," B Zc*YIM4X-#VyP @xT͸^kbBĹ[Qx: -Ȕ]]l]*^prK4(dVsV 6S(-6)8Iiʣ,e#@ -i&4G9gq qp=ld:͒]qQE5​fB{)"yoNrEz|XFGMRHQ #p1pq_<1R#@Xp_1WiLޏj{?^鹼֎>/R\QpZ q)RB1IH Nߨ#;G/ &B"h'DHU@#X!|3q!<>EGˬ;S([T ]EwqOG/XM ~[/eaNS8`!9?n2?<ׯ[|b75(oݿ!l /13%]׹Լ=0Y%RdžUo.:ޅzRȬkjaZs[hPyſiϓ_QNiB/aSƳ%/~l79o,9\oj$Mϕw5;yt1Χ=n_YFšK=|)oƼͺib{ٔ -L;eUV ,Dafם|0]\1qpќQXIBҨNsf0`R@ -D h38홇< 8grsp *FN4EK - ^b8-,zyt\;r#{U|Em"}PWoLnF?hC(3{l>lwu3/bL0H"`":Qy2<ʥhsHȐ zQ.'(KK: r&VX"K(r-9f#59H9BHe*QϸzF1ʡ`V2QT:P)qqoqyL{/J{ob>lԥmڂ~jbɢ9_Ϥl& Lp |SEkF[:j UU!}eH_#6IZ%\4k5G"j,'Ȫ="j/sǍRȘ"D%"K.rY2Ia<9Ķc[ y*w_1O24!R))*@%B`j=nZ*B"h4GJ %DkP$9Ў/`Kwp,vلa?ҴUxIǣpVHJ4H)oISM5rTP'dA +.Jxn@Ls6. OSB7Ǐ[oA.TIoŤ՚QDPMmӐwJ_Ʊb`}X p j@5S@$|k<0;wrf;o5,{. 
/ވOtLnljPNkhbep}fW4 aPN998֊^7_g0ulQB; VyrF\Kz3^bQ;AF5읟'.ljpm!/N:w-=rWKkCE4hCStM5*OPIy7WGG޼?o_Z'9\։*|VeEE܈-dߢ=(5x^\Bb !9nDPN sSl^vxdIJ H%Zh^G8'8+m1A཈ꐍ,4`4}N]]dh;%) $Ah:PnlXGc$J")ўd1֠(GEJn]_C}+9tV'\{wk5Wwα\ .ݹ$ؔvʾ8Y1tt~t~sqx{=_h'ݢJwG-xn;T~?[dzwӪ3n\LYՇXAOUw27_ToQ.q\XN>pjg~xZ}Zpk1IL8A;yAF I_hZY[nuhOUThoutY#ٳ)hYx`c5IgѰ77n{\d6 ǽBKZ8jb^b`BFQ @EE [\C%A0e%0eb^r(GIw []­[_b.-v;8# f0j2C?xW[, x,yJ \6)%x֬D+Fݬʌx*) 2ٽ 7sZc>~ZZoigwjaeN>1gsx.ΓȝǫxpFKkv|vtX}ӛ]F2|e#oz0> GKW V*66mTq'M8/ />ߗsî{הkEvz~֦ y},/ZT=>/<?{zW>),4nVpvrw&'Howoj]mD(an(plgvRx[;g')a\YC]vFw˫{⺧.ԵTVCO{9{Aq;c\ na>,ty[ 73׆ޡ?Ԏ1F}G%?)vpdg^bQnR#l;_V[u> HϬ]{\О]-_~lO\>UxbjWha,ILf.GE @h9$% ! - 1fm=֒P\&56B"5g7[Rkw ̘RB)Pᘋ{To%sBgxT\8/(̠hZ'궎懇mg!_jK4j9* LZF()]p cp- w363 a8d/)tՈ &1bqѳvV+D#54(5%c e0>!jueH WE7 U+c+:c"& ԙXqrne#eK4Ԛ5'a\a,owHGmY4gi&0QIPЄ7 BYUۣ ^"r?K or@f&= HT_5lI޲0C:[jSmesmr17[atZL'iۚesMLnTF`ڢ+Z`8FL-ܦ-. 0ۿ4,E4^Uf Igy.j (GʪBs@m1<9hᛏ*3w}XV:(R(]&d*DPTH>"Kk=xPP@z+ըz-2' [W*ϕ+E@3XȵN#z/V1ZةVBʢHbQuT+F|R =ATXD 4.1#dn҆EB?zV19E@˶vȍL +R5AYJ>9޶|XLGjJҸh 'k-6#߃r5WhӆMg TfD{pkJ!-d8֞cl |V;ͨ>rI"VŪۥ71.ńEPK*"pB%N m0Z`4bM(uiU~e¨tECPީED8ØbAA&B?v.nO-/ThDc-n8=s} FOKm2&nc@ّ}V\-pn'q&_ &+PB+hPhP+AarCV -Ҡ}!\-: P{q6K?G>閴N8bs."+'#%<$“O"<$“O"<$“O"<$“O"<$“O"<$“O"<$“O"<$“O"<$“O"<$“O"<$“O"7񻊵soyMq:.ޢ9 >'dowއ=w4ϋst~6|_{.Cyk^,Sõ-FQv`EY=b1 /r9+ݐg|~}?'+&-\NЅ]5γ;Ovnr;w^.QY}ƊOO-pVsƗL d*Bۂb!Qs4ktoA3=/#hBĮPWZK:f<xC-9pSnofݒ89,'H^'aOya+JuUmQ>|c6l~;l##<-OE_ޖmw&VݱS/^=x)it<̡P +,cNpu  !ѣDvUU٠O-sj9)ZԠs3?_1 n7` B"*WSR.@E#WES1n }"-ACVͺR?Ve)aoIx40=*-)t=$tY;bMtf7]8ttR<uG?tGu~y,y{j^XR~6u4?s otixw.= Gx9tW= h> Lh{ۗ8K_T[jCZv nl;\ K-6ս|[VuϺ7eѩ^>O*rKu"٢W<ʇ(>GXQ'sXfzºn1J|6ysk]?t Pׯ,{y3/qikwU {jGtytzN@^8^mݵYnb(c|a(lwqr\n9g럢ݞIŔj;Wp.r|7Jqᯛ}pj~=ݬ|K̽w Zrz_׿7i_}FdQtCصeE]bas= B^ȗ;OFw7˽qMta*w2nQxeri=H9leu:ZϨh?G`8o~֣_"N~#O~q4&dġ#gEMӮҜ|A(WTD_j)Bx"UĺI,r \J~*vBM.Vt׃q-෽6N:zvSuuQY D὘GK;>L}>3̇/7unӠ)zu8lqG%#w!7Zpo{wXhKeq>wЭk -[ ڳ6P} !-Bѳ!oom W\m*3b,cbH >G|[jEYmqN :FO@zNC <^&k duRr%:(e$$s^> bUȌj?n(HCW}⾟Hf7Mt5or5unn=:z!tPƅkcdLcЦ<2%=#C$8q #Ŭ GB?<[A? C !GĈXjL1@)m61N8U1!P*ruY9V#ԻP |F#o%.Yqd. 
ŘܨƎb-w<`|)0 ^ji(۳BF^2E J4F)Mus6*TLQg9N&j;pX#VòŦhvEoYei⺾ey7T˱m.i#ijrdȉX-K[x#JDZ[\RL;mK 63fD-F32R8 W@| tUYsQEY .P)6b*Zxtt a@UP`Ft6mڣ~ 5/*CtJ=w7Mj*id9/ KW>I2"mĖ$*nz3ߡIT_G mofd>e@ł}4Mž=|(ݧo1k7_);$kp8[4/+Wz.#fH2]1\QQl%BtD0[]MMÂ{ NC(w7߹|؃kf;>4žu˹Ƶ?4ONm0lۯ mOdgݽa5oQA,ZSo #]^LAf&%ܺbɠrόq4h}xz|ّIc%FCi6H#ɒ[O{~3fɗ:̼50?WN|]5wyTݒX6j}#/=@b͚5W_ UfszW{OܳxڡqmazOxJi[URMo*7D 9J}jb_SQ 筶|pܢBmNN4& %b-Τjǁ#ziG.yp.D~ @WthB*t.+%bsNƂJri+EPL5+g JGMd9h{LAW7ap#e7j' ^$ $H ܃D[+ද>{YHwCi.jPکP?ːW"Hm@8, sUZD 6+&hZ+׬X㸪 *tiBW 3Rb%lzB} *u(/OT6GսsTYKS:*\jٗ͢#\::`C\h9(dr/$<4ߌr U0.ҪhKmp <̼R6lktPK,\>%fd !ӔR}$-n! FRMJ(哠ΊrVr՟o@Vͺ0*^%G&D@P A)]ul'0rI2 T$%5T^j%?{/u=l6Ks+_( 1D\gmbd) ]ir!jUZa>^9>?]vi0LY( ]@,xO2OHA#$ pro ǂ86wBF}6YYA4ϕYe+! A*07 ͈sIS"Gc?9"Ax9*P/4'(\5swPYk;a:JlvPV)ʈAhiX,s@  (a]4RfT2NcBkZi*2"FqC-tIU4R.̐3o;8PbMßٿP3`}^hL 켔{?%?5қ2H*1T" bk+Y=ۋl{14U\x< ;.)Q 5dY?–p?7u Y)bi|, F-K1KpeF&S g$҅ 9A>bV{R8EU۱~`r~Pk87o?C%BJq]" &L5~(9Q1м:9o9Mh]X/UwHgj閰!GN{e\Vq}B)Zl Jl2J+˦鞹܎cbKK'uUЁHb2%%Ar q\X^zi@+e"J,K–?3b,2v5Z?az{=i;<{}Y h!\';/ض]z/K3rq-ܨ4`}teuHHJ]fH-1QGZ w\\h{A=Et*XZ$], "%K1lM@F`֪`Ajg!Cwp}ICRO S3h85p +gF'fw0J-mh_ghQU$ZB 6$F_ ,$QH4 Ly^kjBeAFN|i4M6j|rRdRSNk^UVUp4S 9;ifUSfchcVZ9=-oe; Q`R49KSTL HQI֌9ajg Ee](z]xV]r~Wq bNv0bqh8q-4I=8EFSΥ4e8 [R*B%>bȩ ^ԦP界ۨ3v2!eVezz`}-rkl?-Rv58UkZ[ZGRZ[m4֪d2<]D;Qi3Ĕ̜fn^]D !hE4ꚘhPTRLdB@KK-Pe(n5b5r:._'|l|ոT+E^/xEYQýܠdn>YHh> #&+NNZzzRa5}OӇgPae&TZϾ#;5c|C\BяL яw< qo?O37~eB ~Np `2N' Ҝ|'-yŜ}a{0`|))gg WFjf_ǡ a04-CL&?lף:lp5nkϋx4;`|X^Y*bLX], ~g~7o0+aQ8a9޵n_(P_'w5%-:Xg{SK'e5rGU~+mDtVb&% ԯc'cVNxsuo+SbT$5~|T$fF>; -ZΟa2p: iMϠ~(AN wy*i(32K>e(%Jqcx5 g8bB@K2ee̸UIHX\o9C][oG+^69H} v.@R-q-:D?3CIF$NjR^JO;Ft6hmx<{7> 򠽱ZY4~W5\(Gɤl9-Iqy.sT2jmLPޓt0DxPr$O )18Z!$#C "%'0)Mps:IUC]NrMB X-;kBi;0#$Ouz\[SNjii*}' RhOV{TPX*d7/׸[kH 3=8IY+zIH0c8"PJ%B 1:EUt(#c{>DiǷX==AK%giƙȾϑl}{?I%f0Sv" rqc}`5zZmG(ڲ eAdGx !^mqM;x̴49HY(D )J ENAr:dgR&3ԑAQbQ9 1RFGȬaI)gR.[6[%+N|`{(6ݦ"!}"1C(Ui1U`'ߎ~)1^fxqyv8|4" k QLj۷ߚuM,UfD-J64%/SXO闥Wb fO{Ze<f91YGvkޒVoG/R=vXGxI}LYzID6v)d_ן75]mڴ뿲M8:lz:WKxVb*nj^?IԢU'{;]ŽxSG=s٨MoUFn i'}KDx>~/g/쌞a>h^ߺnl8?!zYvna(]m5ߥ!ߋ^ 
^RSwnAŖJW,{OnK-vƒڟR]RuId 7ˏ7D 56?5ɾ gj7֛u x*Lx5H`݃^ #URiXorEּtT rJݎ=j5sQN4zgj=ư os!!\((^p.%Jץ3@eP){g1!YY-W}L♦'ꑌIĨP2]K (g#*Z e'RpK ɲ_'?ʳSOxXX}_0zn {"ǜJOUU٩`e%5kNOuq¬OfqNuDYYVkiUʤљL ,XP"e<ÝIؠ9Bco ]ۂ1h \>&LKN$eJk:sD:YN%7’BxL`j5rFȱjQ_QVǎ,.ϛv}"ەi)8p+ـ7(DBNR$U)9T>`Aq=|$v~D +n'ZP\*g~bL̑-P09%*ErDž^CTė^,P4BI>Aߝk^ˍ%){{L!)4Wd1+F0"&b@*jH_عѹ_GK0DQ2G0V&yuqzbY8*q&rqeéjf!Khx(e1c#XA)xn˕fj(Yl!po[ީ4˙[E<0 }jhў*)ǦY2 1#!H'#g*޻!ZOr&7FȜޟtJKXBC5R[@`v}SM>mi(ۗ@Rׂ|.ǧq1#.=Ƌ*nx$IF||WflQcWIsoUvN4|o'5c$M$4gV}n(N&8㒉3l$~6jvAB>}6i5CrtL$-kuz5-8q]Y'uY.V秩/Q}]Ȑwx5: iWVZ^U0eܖCU+ri.IWJ 0oG7Զ5͋{T4O~8i?x VP b?Ǔ|:>>iV]rV|Y剟,'ρu#[;#_7X?^?fydZ˨iSϷhjг1&/`GOrݨk괂Ũ9mt]*Ox>M3j贆㸐8uR|Iz55X\\ײI4J󯴕k%WIW4Іjکl84Gq|vBL_zs^yqۏp#.˿Jde:CPGB1ΒEDk 0Iq"IꆄS.φ+fU&c;!v|3ؤm}]l& fS b dէn5lm8i0^A'ϯ-=X"pq=#uʆAEcrBSHԅf}!Re⡓%{]/&۞bh7Op^?z FKCoj&c*<YӜ3K_++IkjN\E{|wt{?;nݞJZ/ܫ1Wdf^ɿ~k3R4gȆ|? om^ݨ~boOTךvQQk0ƉjTB6Fw3#&]Wދ7c|q2zyg'ZpjX9ǮQ@}t?ۚ}Vu 6{%}[ݷ9aC9:ʫB-#BT%UJ!j%}_Nbʼn#-_E^hA`Y9-#T+Ab4 B!mpIw{f# ǘW4dQJti\ƈ #iM 迥::늜*EXtAeqge)M8ԪrMŴs,r|I.  "ssM,iYs;yu/>;,9O9riYD̃W"x]zBI3))y4B ܂!Mld 8R1,0\պTNvg%.jW66lTzܗHj$HJG JQNZY MFf t+OQe' | `#z6/ qs"x+*juil`'=-~wtӎԡar)H`D\!`l9ͩ)QT6sRApt"8zٞBCe1G%%*DaY@\@UȲÁۺm%/ޤ= uEve>1A1x$uRVMn7 $yapq{{t 0VU_+w؎~NYQџduGԤTm/!c%s0nnTB7vl2)i'8;ḑi"(cU(*,U޵6#_ny ^,03ƀ-y$~)ɖ]Yl:pYIL9̸3EU0 Ck]og]wy+a}G??ː%iy&w7m?Cn{ dGqXO *%2 Q`2Ȣ);P)Kk+|5gY6!RVK@މRy R-3=?O}"<Zp)10]M(ّdjRR+!#yG>5Au6$TV`B&KRjG8ܝݻe7^)T1nh#o\n5-"14R;Iô[eHǁ;2BkTM:,=gS7q7#2cB` u.Tt4R՜jYXI4"H*s& [K(if|j5_GkE'v?lIBݡ\/.]U=yzADOQQ4 )p[텧uASx.T`Ȁ͉{48!:Eh h3Vw0 2d g41G%F_r`lT%֞ͺۂ]ApאijM-bP,1W|ls7te(GR˜}G2VRtޔ!kF hPr< NA}iTDʢ ZOcsMDZN,62u9)2_Ԕb B &tE>BАyg4(`m/YIYDbV)$Yj;}h`Oָ+h 2Ǘt9ǿg~%/:5lt9.i\.PųEQzV>a*?z=) tPvtQwX=#$7tfH[z~7 Mf Q^b STO-2ƣ[wa~)/Lѷ} R.5ZJzgets癇Jj}7.;QłԻWΘLiGybG[gE||Y}Cdf!?4-1xUfCD]۝}ܰ_Y{^z7ҥڵلMnC5/]zƗKUls[㯭y9Cѻ[fAnqn1[ݕ"W^i}>ԭ[ݾE8}<~'> /v"~l[ar˦Fݬ[ݝvd'n>̰N;mZ-cmlK Dގ7MvނdGլ9! 
G'mY ;oy9V;N[o=zqTzS={|`#m~~AXG3i%IZR}JGu]_uh1z" 6HWrBABeN)F us2c_4bG>O|_*,>e$;I%KI/gwo1t75.%b\MWd ]]ĉ{ .'5 Nk%@}f|%]6 Š<9B'<6$0I#'%%i"F3 M H0CDNV^ͺk)vIV,77gJ;~;m}J` .IW߉ZB4竽LjzDq"+ U^B rGΗ1V %žN0A}CSr[Ŀš 5gEjo)֚fQc"8W9rJM#PLZ tv(K۴uVsMO(>RA RZ[[(8SKhmj ]>Y2ǀ aY#)fOLtoݬ;/5 wBi*{bƠZcEOŐ9ϊN8S41ͷZ܃ѼќvMblr "+  Bd[ZNV Jd3+5m`o'+޹@ rjn>~gm۞6icW/g{س.atp=6 63"Ny;: :Q_Q0)&ū}S7KJN{`k%[u6'H v4dQurK&H'22wY)D!hO!C%Вlc6kld;Q F6{L"焏 Ywvj>},y]SUi6OUyxO(V5>ˊ+%)*2Sα^˂ċ1لQQ)%"TPXQng+)"QZIKГcRJ!'_ V) ug72*Ͱ8 uc, T{xjffqQ|͟7g{c |v^+){c:D6NՊ,C VLRBG˴% S߆*BRB)*lj&WX1* -1oԪ;&"1O͎CQ6 =0حc#ڌT igMKd kTNX"y#23C Ύ&eGđ1j^Ȣ:a^/=x+0 "6"if@xa(IBP`D}}(\Q&:Vx*;6:`m h:U*P!lJZlp ୺GEGMeeSl6KE퀋.n1M $,u#RFY>jnJXNA$+Ȥ\<. 6=5,!^Pe9 Kޏ/Z39p- 4XN@8KץbT}_18=A9=WD.1i,A{'O+4 KQ ABR0,̟EP17U Gyq5~Q:%S-''\x{uyƌl$ƫ/uM?>µRo)+Xɰ,RN'Ub"`DTen@ B+0n0DiJIH+7"cLdW% ʣf͐I1%`=w68ٚ weL>^6y܆ ;,xmm2}X)*ZGe! ffx)L :})(1OL.=g@ y ' yѡJj\H#n1! y*]g@ p3r|xXxD :i'L,),ԃ7xBFSjf-C%C:"RZ]rz_F|W^4|MUa]),1C qG R9e(J"[j+)[*Z'7?rӣhЂBB0)l=w"8R]yoG* cۛ4Uxqw' )qM55<|`>|d٤$LQ-3nbի]UJGE`x4#4M;+ ^mBXI1 "0cQ?oypb˫?_#uQD1N_ĩAI)슨\4#^6^BOw oZ|Z'itMe"Ex0I*cGYs6!]rw04ܷw:~͎S?2ݭi &aZWJQq#@3j>yUOo nGY$HK5YC-LRZMHMe620Z/AOߙIGmm@I5/QVݺ^;FTwE#W4nQx;W]Y,9ZQ0<&m<)㜻?\6 I^"Ա4r9OuogJZ1GZ61Pr;VGxcqiz_V'ܼ2]Uu7LuC[4okhVdӍ5/5&[)k[ |!DWoYjumdQK7m!mmށ.YWz|_.D_fp f/qkbu^/x=]-&0m >2IM*ĬcR^ޟik+2~7mjLk^/UщxGCئ\ŽN;>}7K8[RwߗL"aW9e{etôF$qȩ"R)-^NB~Ϟp=`sԩ)[!ZU )vCeу[b (܂\[яK+2oJƿz ~ƒUܝ↣r_l9R]|T\pF>Wf۰* 1Wڋ_D ;^2p"3Z"R /#:F_}_9bbOZOP%ݕ>Z|`72$"MyLY#"Dnnu7}]YKjW3z7:*ą0,erpG˼4XxE1^*m/K@O;B"]jk߀ ^YOm}ԻwB{]WXct-TʢO6RJcU~yqI?'}au8^^to)5g|5ձ_Av 8_cKzl76H`U?. _\INB]-1[jt7#\,oB ZYFMgMjq5uΉaLqwdW}Ղy+Ɂ|\:e".Jh+fص'WKiwxҹ/-,_aTQZĕ/'MSG m tGW_.uW$O󛳗/e~Wg߽|s:{go^})"8PW.6Xe_~zT?[Mŏ7zk[5Mۡiu3;=to XY(!3zdgf\&'=MF_$$ )n$GR2Jo:e sCR)ǽKNϽ.e枿Á8[ia&jxHH `4 Huz99 ^A2; {+8=Ttl1ΗhJyN Icg`G$%-=dR+AK&. 
K{UݫJ'-DA 8MF6( lnKk Nsj- Z; Nl' ]Qme-8ZGEdQ| \HZ-(7&7r>m'.ICJ+Xܑj^F=;x1ncu0eLLq듼UPQ1b8eQPi%f8:&d^7g而3Dv{POlƨzWR"m =h,XIJzJ#׫P0,/HMLc14RcibR| =ۋH  T g> R%N+C4 P`Q+Lt㊾)%[Ȏ#kE>,A BR E銧/@;e՞dz(_-L5Z=Y1 h:e ?- >Wi>-bƧϞި\OgCpD&aC5&dS쥹]/X΄>O `&}mP4On2uo~k ʶb ԚHKu:|}Iy0q-Sarh['ba Q+9t<'ۆe3oըFiVP؃wm7B88wNV[np "ĝ Vo%SL]dznt*sI [S0N9_cr4.sp-E^5iod'X5 #[5%u3=c0-AUwTf 7 8x}=L~N!X"~B(:J1ƑGڊhۉ‰z8/>$9([Σ(\*td,Q X ڃ 2A:k'F+m1o"XKeVѸ@)mل"JFcNы{,ռro:ȩ]YƭJ=Z1l=j`ypQS-0NDn. kjAWF/t4FAդ9`n}G_5+<,`E &uD%ԖhQjcRaL-~= O`OnnF̼x?^\*kf9m]HNz_Ov(@vPV}lLlJa^iGjd>9Hyģ^sĀSihTn;i0oNnnoGji4_U5t#=*뿆AR~{SwM}=7Toc1Ѿm/ehWݰ4;"]%ٔ /u2|Ckеk|ڢhkQ,diai1j-z*|5b5u!ߓգ,O9y|\BH*NY5W14WxCQ}.OA3W0k MߍD89|xGCŚ1\i>'$8md4tleI2$/N/)+Hΐ>JJhLJP_`h>rb7tt#: mD,IzzZMC+ȱuؼ>#V6C:)L9O'aՑ|X?*^*nq]>V^N@Yjlaa>+vhZP|xq{VȪ@48."x>tfvy LqˀY,C2Ncʠn2Q033U*!?)ߓl OɮCJVkk[kIq*pIJ1z%tN1 ,.UVDͩa##JMS΋3"MD}_e/Ҕ1NiC)ͼE59b6tgg1Z}2)ZMh9T)cS"M)]OFs^X'ԟ$j_8ml,"52ڍ@eF76-ں8eQPN.R,õwtԅ؁0+"=h ABIztXl,]v˄#X@uOr`J8S`iS>C)* {1 *p:RTm h zZֱ(/Tvf9ЗdN)qZ{ن>4Bw^.C^^]S%#eC1U2L6Hڊ\lEbojWhJI'")&{TSĉc\6RլR^yl\nZ+bSvhYe)sQgkwu( sQ!͈FqOr@HY[] )^U**d@1#3 1d\qEg1(@[*9`mDj[W!6#μ82/ĶW[Ul}1X9̦⬘Iμ̻?X(2R:5O|g>UYg47ʏb~1ϛ1?Wy*-opql~4O XP솄hyCf#hKu['Isl ~- wb$2W~r]yZnё_\ Lz|>R|Ը\{"?pLt#tX(h!6 m6Uf_s<~I^t>t|Sirb" B& # AiATQ1GEmpCCV! O֝!#eKTTЙ^!!JfY0(vʒ6F~y4FxmU̸B1͹ԒLjHe$*%RÀZAmG:'&~^0%m/U$b%Jl'pQj|(TѤd>w PŸd "DsU]Y?[e v6:7߆jTU.s":F dt.ֹGRXA Upe&1\ Q|P[dٿEvN){ܖGBf =vXx~u1xg*"8R>:ztn^꯻h|Dfa鞯>D63dȤ\-[mcֶ 1XF/ NFX!NEyr4-n9๬;,&vo^Ⱦ>mF,fX6돆O!:*CtIj y['=8H#grI.x/$l#l 'D@}eܹ~*vT #JeM-di~=oOzxB* F!tUpP-M 6lGjO4P h wpY,>TdJkxmAR8W醅fcvB;bQNYƌp/Xnm_7d2}rXqK "h ѹ7-GHJK+n*eڒ:G[A{e5 6r?Pv«FkLOw%~Ħٴ21O͎Ǣ6tFmQ{d;aK-C%"z j MYZKp-UNFnz2EA,daĊΖ`Maa(%(R,F5;O7qʨ5`<=;_%[[@>i}o4giiooQe.tVF 7'J-@/\nnv[:<kڑE .̮z% 1xPi?ń ~\týN)<[7+a”*6bbӭ|~!h+y 4`U4׺[W]VAָ|c8z*DU:լ']T@ޯs<|ޤ|idƗ? 
`q#`/ºϖlY}~.ݔl|1\x.Q.pZ@wL`8_i-:6=P@[oDw|3ZLa\nU:LuܮhWM^TT^&8^Ԇ%~9N.jz׀)<\4bgWu{Al'I9ثL 52SA2g{] q\X=h7S5F&"EtzAhgD1UzT!$GG Nʃĝ6/lm8evY7,@7:*ą0,Icʿ,"̘"LK@O;B"?ZCQ*,ǼfNnʺpۤy߲]?x;E%%?[*%,zJ<Ϲ Tks1ˍVYD,b_J%i`'M}EG|Ih ҅LAaF0gD}.R4ݗ|D߃gkrEsJQ wÈW#uC\ZT+GEA"1I|h- b>:_YRLy>)@HNcRɜ0̵^\[k & ' =]uC,n7!A,{Ӆ^w'&'1%u:ɺ^kg^)Eu3S4KYǗ}}w\NkيXp AeIy2O FצĐϠWIWh5SSߕSǙ/I_߼<Ƿ_zs:?뗠 .@]uꆰ*' ' ^ysYk]CӮb YW ޠ>ha5/c1W"D !"6 N"b/5(?m7q?IHA9JaE4;Ku9I{/+~z{d ]r< 1SYl{ /R~T'E"+}se0)^ɪҁӬ\tͨ?.9:N1+ݔtY.pִNvp3 u8|p㩛0}ݏߟ {i۞Wۦ1Wf%v%kƛcg-[ 9Zr;AfS̐b{"?MpFN, Tyrk29yC;hNPڱkMATe VGp&Q])p[-T&H3&q2@ ˃w4 ?oB8+=˹/bwG $wol^,v- iaN(&Ge tP[ ~= X$^ 5QC=M׿u@&#(=oQUo0֏Sk3lHbZG+a +#: (GQ$K[^i>,rzVZV$7q6LiIWI%Ūc B"{_iv}V#eB%RX+k6>dJ{G[t3~? ,JcioSeT\HLwgFz9Y7`?9,/6ːI*<\(MOln*dPaShznz0)S> q6nՋgG"Ye_N)BnAl*gZLSR\5 ʯ&L WI<.ߔQ$qX+k6adf\1݂oZ1ؤd%K"qd^γ7S"0t9?ԗy6 ]OA>fr\o/-QKyK›1zE`UQ;j#kX OK,Ă6dhP]ڹ?{Qs5yMi(֤ ֒H>[:ܖuőHԃ$w8t[olKIΟV~p] Nǚ|fd/-S[3a{[~uW %Zsېb3Im8ij XI|z?E]WZOWd7s0ߦ‚HX_/{g gsh~4ю  t*SDmMSn~6_꧶ԭD*ݾMtG%rU09p[sY#:gsptLtlOVς: "w #ᢎS"Xr ^I% FcYJ=JB>pڀ1U8ou'(HGa]^aZIBA'RTSqD=鋬:ӗvG֌/<|5e.S",@x'?N?y6Nf'(gdgA60p p/,upX~FrpM? ˦NA2IyuIy Z>G\ÂjxӞ+ykG_0Ƀ@?!KI;v4m6G d.eoSY 3:}]t߃ fps4CiP4NUi7pW*|T(\$8j~Ɠ._`Mer_@4N}ρ޽ (/ 6;wZT4x%0؃kv*]iy. _ [=$\#Ʀt]koc9r+~̮|&lbf;h,X|ؚ%G)^I~$2e@öDQue")_Pb w .#:BrxɪGh>BB?8Fo5?ɟs iˏ_fq8`vv5FM;DW_uWvUGߪW[?>/G54\&%ClؓF$AU%XO,^C6D|"d?+d^ń]cI}B3r-AXW!Y"`1$\㡀C "dUG]C ]z(XGR _TE^BzXRI@h{ձFd,E]:ٌ`$%ˀxxX%SN,-]ǎjU?#޷}j[;Ja[77Fz%mJ\J\O3 [cOMͻd`5RÅ3|WnF[_O},w>wkk,) e,dESFot=KhIUp#{ҽ]]=̇.֓Ěa<i,/g;OMP?pǟǓ/ kukۇZy{7R~p0#~cr/­­f|ݡo綁nhb[z*L!vRz6EZWHyWɫo4+-6ӧbUaA}y="3Qbr(PtrE*t*,! P>nZ OfبY][k5=q~~ފ'8NDڸ }! 
e:SAEFjhyat(5:vvH-z$yG]<6F3 2{/YGhB/DlUHd 0ƒ;Td *¾Bb}x\Lt,{]THȘI/K!pVhJJ%e@联Zt9&jQY댔{`+ dZCLV<Ϋ*Qjs7/7ug G`5F9Q7UNO !(] TEe2TD WHyi&qDrF[zmÇNEh)xGNz^ZrX |"۶0cr($,xP@yQ F@=8GY53gG?ށ2PcI@u1Xv_Yh.f:d_K-]ƿ|_8Uhe`?dH Df)ɜ(JETՊcJ>Nz]vx4atӡCdт`EA^h9s$-x聣 p򙰱qZĽG҅tveZg| 8e+TZSd Fb}Hh0GH%Lj9v+"l4!?ٱ8_!ηڅF `O`ٮZ2UHh$r6"ezh7dl'_;}{|W:p^wM]mBۭ_DM@Y%$&l@+T&nR}vbNzYHq18]NFg}3VƣzJ RtrN0@&1X8R3c2 1 vv"{6s@z..uG||}Y)ϾǶc5&Q3N Y>;J %r5 ct!*e8lG+@:_z Q*F'twE$ҵy4Raw!m{vJH aW7]xƳR$=x}Y6[ JeӐ kЧ[/5kOBbU^[W^{DV<92G I %+s& [K(RSx0W6[$5=/DD5 Ym QIJL5cD:S9{ZX6eܼf;Yahu2]|%#W,!15K\5$GR rT.F̊yU*"eQZ $o02f%-둘mehs9;[Te)@Ҙ, ¤Pjdc̜Ǖ(8,,LQJD1jf]Ѓ֬ϓ;G (8h\? ~_aoyr9g3ƃ.t6 ?x2=~Z'J O{իiH#ʡlQfJGY{v>ˊױ6Lfh2˟۲Ѽ /` ST.b]QOLjѾ0/L)f |.B 7nH˕۵f#<-07CsuVݬvuqEOǿ+q/wX{w_i{>Xi3֢Yi[ zm=`[Qz . ~WV>X͏wv~u:Xj{?3g+r; ~ !Ƀ¾[0OW_~9\@B\;Afd/AeF ȼkno0x80u5rZ$%_R橸*k"_R*DlI'G`=S*R)sr 50HFf |n{Q'ҭݬl[g[YG8WK8e.:YKa~IHG KJvst Pr+ה$3(Xm"51P&asyB/b@T6MAzS)H2)h@}**ؠHzb2sU!̜4)Q?s>3xv;b -/rQ 3fxQkL2dud xM=T'v+7GRP63R=G<Ƙ(GK1JNb.$sB-7FTZ瑃y*]6#;q^R}2 :[4jc0 o *TR^o{ݕ  =fԓHQnD҇F|W^6 ~c)ZaLNR8hD 0Mf*K9eUV{);*Z'/{:9l/Ojn|REæ_r-?mŁʃY_]v 'r0rwV@r.~YQC|Ғ:e8Ǔ _5il&g?F?F* *"W!}Sظg`:y7,=P+ԉ^}{i7~L: OhYʻo1<#lThRO\mYByŝq[#˒®mv+לqVgyv^RDH)T:0c/RJֈ49UDJ8҉}r`Biց`<] J"%@(#tAy2$Vn^@O z(9؝Ɨ^[y!'L8}ݞI?N4DKf t0{L~a/ IΙe2ߟ%nZt}|y7=a2*d熹fSLG㧃i2bOrda:0yMjVh 5(O+Q)o0;n]q;w^&{Ϭ{ǩ@vTHHb"x}qf 3 *܉moS18Mq+:2,"GqGk2~$JH!]v::$ZtP ;vcwVg,͇`/<Édoʦ;"O\x,#H6 L.b X)z)oOyS>vpyAz`*8 )r#S"4HS9SV`o$Fc4< u M)NIRWhpLY 3L H^<%QpN;B""BlbFj+ oj W-You]5R{+ XxEO97jmn#f)z˱ B| !TwZN|!>eFGtIh ҅LAaF0gD. 
}F`w!ܴ/0麷Voٖ:wJ7Pe8]?ĥKrĹQ$#T@\ "(p.̕c_ܣ;"itL*"cP͸Jb,"BHH&n cUbNC o&mZB>S+E!l,Aqc4XE]&+i S!Pz˯ZhgF+4e5d*$A ) fpIH66,' i Y)#mPhK DHT[E *X|NH_ C6s?疆#`UʀH3; }SK&O4}L:8'ɾ'5eLMag󺩡S\"П_ـǘE}Z;C/G`3I ZV $ "a%::]#3 ;:L ^{o%P0 0+:mqkr -jXM 2p'u8=IXi3ydRXcizzKS2z`e5xQ0SJ`BGn`ȕkS!LS8K<' Q&g+sΪǓymV7|6=/X6ye+0ENϪzK@ٴ zqfC!6C#1yaH0}edyRbT(Xbx,d1b0ir}SFm&6j\l>$CC#e_@Fc^=ԕ:Y#*Ɨ 1i)ܜ㔯}¿L[MEL ^RL8l +h|aNГyCM膩 ?)vg@$w'^|*|o~}ջLɋ_{¿LKqm ,ʮ v/׆_u64 Mbhu[DLeܟƅX,jwK@M訯5'[Z*MIOb&o\O7#)qGo7NYP!LKJBvl c}bI ሦ4"$49lP{Sȵt9U';:ԣIϤ$p]t7 ֋G/bv)b-ELq$0lQs6 +(ȕoGގ>wә"UV(MGP-B{nTklмFg$M(.0W3H-ra4 aJiRJB xlr>O[MgJ)7yn顷ե ~VYw~-#YVofU?|g: a7Y3̲3xvTAZ#({ v5砦r+vڠ?޼|K)U6Oɧ kke\k=#d3Q?0p|6Ys.Rs,/^hJ>h͑W{ɱ6$Qs-9˝R 3|}+Kn*!31)أS%4LO^wɼ6QO].UZܦ}v䂞P|g DSJmΌ6>)Dcr}gՖḦ/EDۼjK+gzSOs_sM~Z[r}K+0=I攓U[pO;>$sE%KJ_xR|JN<лY7KWx ෣cl}b뛗V=C\iӚD8I+=q[TW'?PV| ^:%`J (ܻn]u?[,jCxR~[jm ߆\!T:gjwpf hw5 stZo^ȺC JUt9[Bbsya>˾<)qo)Q(MmQ0ĸ@,\ol1DST+m>nb]m[db{T5֖~ɜqgl2P٩MQ5pP-hy `$ (o^{eiD`aͯW7G^e3Q~($ydY}=h`SQ9C)*,cO,V3A RQ(Æ2dy9x5A"p["x}kVW㫣qg&yVտS.3DBG5 *ZPj)RQ)\İZWJM zb(E#8'\k=qT4x̢D^RkbcN(L͋x57B˴ˆ(&V`B % ).QlƌL`^ˈiDk45[!-*5ewEd7 P ."[Ba$$ c]koɕ+>%@haf'YnA1zڄ%R+J3sn"%-QvO,g[$}>)ֹ3mDۤIM#b|!f,cWIx30fE̍E^3"Z䔄;z LxiD?c YV1+!dc쎚u=m m7 *)sR!D!C~BȥaUV{}E`{ d  E4vuqJ;d|HEb':fxyFa5Ixx5ARTE8USI{k%h(%%Ɋ] sHr=ʜ%x&<4}d S]RwLMZ%GYGW( s"_',H/"YA-NiOxNIbѥ`D)Ȃ94Jt)*k_cF>{/.Qc &`ҙBvP=5 ީE`5ˎaJ$z./VQq SNiE +ؽ*؀d@> `-iЮnIGDnDM2U _.JS$Ecʳǚڈ.tt%@ ـ 5Z+3\i!/ +a }?jEiZuI :*P*JvTZ2ɷQ#LI2f,؎yBނn]Y7ia;"p`L ӌ/%RcYLu>% qc-. `I. 
pqDVZ hަV J8 92jܨ`5po&Y#QLϱA&Э.4(2P#DG}jj;7i+KP-WwV4<ܶMR xY+N";>nmQ5]@,M$C.2T(ʈ>bM=G%xu>h P _@10 9B54 R{pUf}J`@;뒄 8< |MKa.:b&ԭw@7ۭId3 .-z 56W$v#M(٢b1j0(5(U/ Dl~7mB|@#{e\/ٿ͚ ,gil\O/Na{Re\g8K͙y٘^r[C {n.R 6/ >?mLö'` J&\7ًہAM>7E92swEuUзI[mQ֧y9.]U/rh.{Zܔ|D0sG9MB9ZYiWw5沴d38_6~q_rkjb<Gq&17x[YrFYhjn =ҙ/ZdX(- ?󨣟HW_NoRE鰔?$'7/m&mhV=菃??}40`xV6x+{tLL0{@qyW0O@;EV^YG|?ip4UttǂYZ6QbVs`b2,{4@5Lj@O!{4.rXH釾_nd@dY6F^`RK*o*[mz],/r(|wjiv#ݞUd-uqfeZju/ޟzAG_f/;ñoū'ĥf / j^UIdE#{ӏowCn"]nE'M9 FNG*Vϗ7rYo 1ҥG b{hhR}?6IX _m,|3;50KT?R{?/YO^~i+}Sݕ!m8~`wu7 Ėp?V}ĦW7'"DS9P+ݾkGrۮk]UXUoiAGدeK\9-թ- 3,7\7 kf?ϧ~DIgE{;}.tI-| TꦔJm)_Z@ VT&T)ĖD=D\&Aqwz-W6eV~H sg>nhqWpKM.87Cug{z\_wW}酿7|Ȓjܪb7r坙yXV=$%<Igק#rEgwɗۋsɨM,,5>*[ѝ*yMD欤A]l| v7{#Ix -xBvpuB曑%$T*HY 5 =M۔Ʉִ(_ɲJ!ԓ&E۫Ǽ`S9*-I^5KHǪFJgi|L706YGmdBJ~JkQ0%̱~:8BHw Dަ#seb_,7.Blww; {ϴ ; #gzqV7[ ;v5[:i.2w}+P H2y㭆w-ap03<-fc-jAdj2> J zv!RginvnNd4uYY"D˩QKJ \17lEyE4"/j˫WLnvJ)ӷ" d;K7?m7dO/No4^P8'Y9S?54j7?רoPB=;c~z8)Mwmx7 >a9O~xf3: \m)i߿I䞵 #؝mmmn;ˏږ!Gaigo&z|qu4*q]o9vѱFgV/Ml/q$2PH]>w9'b 0"$%Y{w()|%s=5]]OU?]U6A_ .9SntBi:m`~W[V]~|?{ٿt4]ANXK&Yb_`{!|w^M"Q]lZ]Q*@;ULcr;KTO/׿W(3^w~m8z4WDXRCEQpxO/֢ح~[s#4%zx-yhP,!1a@`[EfOn9@V_;}U4 ?IO2/$EN*9/tgqDNJiHwNg'~SfKh?N10wƿϢF+s9e.&Τղu;36, !Ҝ/8Zra-Z xČSHpN}Q_9}zǦXC9k {FPw_y6lݙhҦb\CW쵩N9@>=8.;I'{@e_H~ʔ$%7vρl$,Q%EM#F\[S;"&uaXWAXLq" }2.Tvx ;>[RsqP qZgocsYc8{~7+/ϓ;F0ߗ/~VXtOlΗ`u07b>܀<;ӲK1ዧD"Չ'@)HN54yt^[x1Q9!pFxߌD3&H.paTK-MhM1A%%bBiG(^,Q ^,&N ^Q T\b+HI98X $汼"B睊Uu: gGU?^ b6 U|MU滀hTտm?}oѠ?WߣR?BucT4݋gnwUK=}sٷvٛ,YuvY}5Q|8yr~~]I+;?u6 N9Yb<m-_]"=̹+OG#B^?[v4_u5G_5yWKD5V4٭4ξ?9a7能e8EňVfb(B\N3!\KBϵŘ^I C$1~tsZS"X\SĈuR>%lR\Q挓1iS`;G${4[鉊g?RWzrkFNH:w #A]Zh\ҡU6}c'c8֟/[׾=V֜tyc{1@X񭱇&g{|= ~M?g{9MRg^6/7'[omgRv[:cYLDPpHÿ1dTYt bn-P707Y }5Ū6 TW]6hBlUm|*]7ۆ~26>-1,XiV.J?v+r۳ 2߻aY&|NG[mi%PL%:#@&G Y!: k!fNSA(;+sOu,_;h20S܋AA$Qq@.v+4l"y񾄠Ej\Js*һlĹ?R A2 оtR_ >"dT^l(Ň% xX)KLhMpZ ѹ$69,E:)IVsi̗]PPycSS4R[M HbJR|2:8YlIg) 'rFhĚy&j hgIAL21jX8{%8Tf vLT|*͈GVdRQR`i6'^rIOINBCۨS$Ԫ@I9O1 sʅ %$upM@]#`9Q45LR'mV1Sj@'=/~;Z͡&P: \0<u&aJŰԽ i>VآgBGT-KӼ y߫{=MxD9ZS&CVP`CRTk)PS[ -rR#Fb< 1XJ 
~"1$@21؈ZT:'F-s<(%i!SB \0(Ŭ`Am&XF3Y_w_pD}1vnB|ޏf tZz>9E=5껂P s2(nq2\btRqdMD"&dAEt0F. 5.cRJ<)͔(&g}zJϱE6 ŷLZp/b^(pNFB>` bS`WoM9&3NIҰE5ҿ| & gbtfR4BudbPLozayՔq4Zi+OHJ\˺+3y)2Wǀz]óBu26wfs$?{]GyDhr?}/Y{v[_%;EQu=٭uAuuǧm35n?'ww7yO|MزƖtIϋw>M(GK-wχ wdOzOH"{D6S8 3ǟ5ngEč=I$n:48Li{H#"m1+p]Y~GCh@ǃq Jh#oLj<'tr>SmY,R|FG ~agR9sH]O.tn,<`BYS0)CN *8քxJOAwuYgem]grawaxxwьA]^hh ]N:#Fw9BN5`Nu9!9 ր7;K:o5Kf,@3QGUpΥH8@\lIHAd"q4R੉c\ 28ʢ` AT5PL=Pd]>%~{ -=ZFͥ˻!ow 'm Q=‚H{H [W.>^5-WEMbۅB%HdKīhl-:&EMP-WL&$t$WQ2T\1rM Q́B$h*4LjMm1wp Cs - hjr#[J="/8>T0pK5 *$NεILjsBˤ񊨢 "yZ [Bh )h!D(ML@Iz\2iy4#\z|"9Ǥ-s VHWR 5gM!*1>7yJtJs5m<:cNPQeNJ3mu&<(f8ŝY˻5uCru=sb |)|isP^;qbjQnO^-jf %'{WƑ__m2`n7Yl;_5Ѥ;{{{J)j(Qذ$sZUHX^FpRN,ū7$vҒtdk\ {?kST1"HH W< oY[ P*R ]d/%1+3GJA,pђ YFvM6  *ghtށ;϶G5>}qΠ[.aY]7o<YW;hmQl߃2(,ӶRK*͙0SB4<<^{;P.q(BVsBr- G$ⵖvaFeCʤ"!*+c4F 0dU'\fTALDз9ە Y9qҔN:"'WU㌠vդV z;?>"3%*Z^e\ڨQP QsdޠIΣ`,K"m/$p;f_m>djִص `<}#IqRoH"qZZ6ub|yr6,xrPI<'=:xʕ +,@7M`⨝rˤ:_rQ@-QF$=FUۜI'2Zk<L}㾀ҩUú9\Z Z' <֞KvVuvA{&U]lT 7S6,0~`ȘhIWJ,:VeɦV%tGW Epx1r!xƲ\ʔ0f6f`"pDqAjrտ'c:<ӽ_>eC}[uÒ` 0[I]r"QYc!pw>43(j5I@Ɛ [I?/?E"''!1BVq!zc }כ킚NfC iv;1xW'{yW_,][ƾ)_V%UMP!rvJfuAe!7Eڍf~N59=+'ڒ$ I1Ry]6gLk82T9ۑ?]7,3B3 kIE3nXRayڊ'iPxs=1b+a8l2V cFX#J32C=2NHuQ!c2hR[\VfEP= ) lJm?U4dۑ^BҨt)`lߎ}tWLcA޸c_V=P{`qږkU1Z,*<SD;1tbBp5uˠ Wd!fYA&CX Z$a&ő0> bd\{#g;F bo/"Bψ"x3[Ps0Җ{\"3я8+%I$]>j@$c;}y¸ɭ+ AH h8<3rxgX8<9q'0#Ȍ2QQO&q I;"u]]> m#AJQߖݙN 5,"{җ"m| q2LF#YUBQ%@O1ΥPI>ieI͐x5uZ k}xZ|{I蟮q8EpGSK Vv­;9q:fraɍL̚:ի\W\SP_F$嫻.C%S uֆ[Ժ׍&E^BE-~Z[xIq{Ñ}5MbtD~~h6UqjҌf4+7uaSvFn6Oa/ExQ\3K|']}fʷ+>R-I^G(ow뷣yq<`v[[S$vY:LPAo_$Z}E<Qο'g vZNx/_YbڌDZL/.j?ڥ}"лaλ^injy啀Vϻހp#7\]`Tvg-F.ƒfϫgBz&םNRMdsi>|JwyJZ1ݨ򫾊M#rRJ`]q݅j"~q y5y|TC뀺Y%sOW/~4sN/H{Gt0Bq&۸}n㱺-O õ窃=Jofw9H p1;Z;v$A` j[mg-/6q7mXy&:,}!vX!ˊ_\c8f:LSMpZaHޢ48H<49H)Ed" \Jj l#7o߽"HR *X+$$E Ȗ EE ,U+e*hh?Vk@x{9͎cr]Yє^pէ"JMo6Eh&zZYƞѳ/gkdb8A2n7ۦpiiz|3lq/5T:}{;}[6L(ye%wY:9O]~~+PYu6Ӓ\ȎƐgQ%!aHĻHS<ZbeFoPxvrd.20(BU6,j-]AH$pyϋg#V>5)Ce[)EZ%c@T₆d.s.{!xmНZ(C9L?~,Y >wR!z!8gFE!'M Q+ˈ$Zݭ=ٗ1 9q3F9zWA,cDg a GsbfJ<|;̣!cGnh`r1Y&M"w0>R0]!BTt1gkrl2Ƭ!0Z~lʅ=J@0ą$luF0P <-..!3k 
\447=$zz0$黱Ĕj.if' HU%s,V,K,kM gZ8_Aac)MlW]\ {S{z Lr(:tU,{==A0Տ0cׂ{7fZp 49V˚ ZmZ$,>&FcKqY=X]Wmr*=h`jn<&q 'acEF5%t.Q(JTI"-T W$x+BL燍Uk4nh{  R1:t`2GMۀ8ڂCΎdA X?`yrx`~YQ"dNI.+I$ @ƒvݮ]oCLm]LՐE}+xmpi,=b:EjC$*f7PK |Y]!jUW5m0E=ƲG:&oC@E@L{5.x 9H3K:vU9G: hX%YZnkT58t(Zi7e#s48kإ6F: g$@P~ЃA2CƠ"RFd1p0",0s,*VRH>K4ؚ5'or!uf1q,F%Gh@eB#޼j6*gUn2+{ѣ jߨn%LMe.y5$oYBeڃ!K-hL6b1;yحatZv@.N糲U&A7}C` b+Z`8GL-`s`*ni XiQi Igy.j (Gʪas@m20ë=7fܵ{8(A/7C|樇xy@%pQD[9S"˥P4]&d*DPTaH>"Kk=xP`Հz{ըz-2'7+lE^H"\x>WP' W`!ZF& (XV2Zة !ZYPcx)U,JjňOZY:P1E2!s6,ѳdD).XZ#eFZ{Qc |,&oDY,JLTPH#4.k360#߃r5Wh;Mg TfĬ<ZF @(rfF8Jf:zAB~)#"Q%8Z{Z6b6DUKo&ebBv"JɦNK8 ID a8tBے+x LBF .QBPީED8Ø@]zFofu~݋˭my4⥃vBSt9*h[,n9ߍB.$}:DXPvd[vn6{Ga;?V s/=T.I Զǣ((ւU@MBJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%B@0W2s,Jf֛T Ԭ@Hs@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJW $ӏ@`ÎG s-?%@dXsT)%%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ RUv/@ꈔ@0WۣQZk^ + %B)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@vkrVu2}3km}{}n_5NŲ~WG@.ࢤ,a-.~l=AY3.ų?naP.8Mzś^o/&>e?gEIsn2N7ѝn;q/' ]fg.?.:m=^Xc8m:uC-0lr˰-O{1ǿt95ߘ}[RaaD;mTwC=XGHPi _oNן*@=|;ZJ؊2m&A3`w Lt%]b&_U ^5?S/M^-\r|1vrjkOp7Ÿizuzdf9g ;$6mլa,SV&u|ΐ1M IJz N5}`o4CY>-;R5>wo{3z`я7WjE. }&U.]VlafDŁa:dY ;P`ծTWQ+뜠ceUֵj'UѪ9Ԫn?9s[rUXtn ]^{խq-!0omm'JLR8}D9䅛EjbD,S[\tME/ u /t}hP[w 弅A #%mL޾f&O()vJfֻqR.,lR}YG wpe olf5h&@"aPs<`V*[)EJ)KXe[.XkZz bC"Զ`t> (Ζ ݀>cW@L6,PZݹJ;i1ZK#8Q˗-Eo=YiQ9LR;0i gw`ELuM$42I-7) vY5FwfC:g`b pc^ LqMfOlQ00.Cm:~~cDg329pCZ+cȵM0#p&ъ]]הZ޺'gvĐ}x=T@1mx,{yaP{쎱.U{vGicwt~`Sw4pfZcr;&wʉo\Oށ.v~VP1C|ЅɎ~K7/мA7o6 k$z4 /6V^mI퉾͖O\0Yj=M8'd_ed vt |<ܭ6y8kxZk.8+6+[U$2LI4Z(jн6|}/ebj? 
8u,=g^NE-Nϋ]F/|ILWLO44KeOןgeO\=Nx*x3XvO:iw0pǼ][~OL޸z1QD/\k/_N'bdO5[J/>kknFe/wψ_TgU&هf#qC Iex3ΐ&"C7k/#bJF$OY=~@'~| }ڌ,l{5sBk$a gޥW"rgYoa>yf|pX c@4yS2!8eZdjƈV ,͐@0RN^f9RK($DH #(B"29H&TnILHq3)c*0{w&RJv*}rPvWzX%T?Bhk$:Qp0Аg9]]e{d;”]0b AjNyu(5ZL7~͔+_cz8pIj:MWCߎh8?x0:!h 2V]uy<nm p.W_x\A^Iһ\pr{L MYJ4k 1)@<ӲIQ+֞d]U+oQQTH!X5yfV4hmG L#v?mC@+=J%s꿁3 ̘еA.&gQ"\|=ͳ(5@y.hkls&NݍD@\%ak̼ mvg9lBP&tg[<L{FmYg3-R`ͶY]16ڬ3G _h[l0¤hr1\gNg#35ztS ]{o.^|2Fɿ/{{ѩ~YN6i`,UdV1$|Щ@5?Nue Apqxhɻ]@8*l668lLO8są_[у2'XоM99qI"rdql8l6!^ .ENw Y:ֿe|WhxȜU,Pj^l͏4G -Sĝ#7Gwh˜n}9~VoS@[Ba邼yg Z)V,NUltwq$^Η;om֔+(Q*K7FAF{NnѽV*)4Q%ý5_F{p/=Oŏ{4fI20&T:K$\^LJK+tdf~qgd&λUOm1(17%Ķ &h c@?"k9:cv`:FOHŠ ꍢ5nzCPToRn_Bs}^C3-EIiݘ!8f8yl{5į4 q W[9/ˢŰuJKhqMX5HQ*PՉSx"n5@9l漷^>o96qc]Bg/"vyJӢ4E٧?J>o89vnϜGy`٩1cNE ! S@0F|Bq/ʑ+7-r߻Rۃ{noI_Y^uٵBX}}\ywc\LT/'C~2{?_jƪVoZ)d0% M ̙HP,aN(D@ T"40RquY\E]6b_Nt}po<db\fft@I? fw ƔI D?%okeִU x.,2?j Ͷ0Y%=?̹94^N*с0 qwA{sI~?V>?X>#JNCqbS&ĩ+KS!AȎ6 eߵ)3bz{=~x-Sջ,%ן hy7Je#t A L_Cҝ ;х2,Gg?n QxZUE_&NxY1Y~p}Y7rYv!m#P*ɵÝ̲ԙr90_$׿^qن@aww K^o xݻށ!Gg ^1VL/x^E`#]t(Q\2G"I[*PSh $yGYQ32.SoD C]r孓 ;'/"WDa/m΃I+/$SJ\˵v^>N9ůX3My$gq[NfUe 'o*@jBi-ٌ]E;s?{϶Ȏܯ % ƼT"  d9hŖ-ɲ5^ߗ-VfD48%*ªt*~<'{lEHb-hm}[~V^vz9V=grQg޳L}]SCܞnzG!/IIڮc?+{G]fY>5FiJI~Ǚ'I_"}ܼe #B7~V:Ԭ\ HL˥1x=siVZO/*hkuҙV$ےJ5xǙkJ+dn->[~jo/w/ Qz@g[szDv&ƥb;[rlg7^o9$.j~PG:F([z'SO qP=c{L?0~s|V[aBp5Ԑs·n/ߋyM5kpۄKsa?Tn-t/^;ZY5IZ4*L?(Lg*CV2Z+OFƔL:WIZ1u>y߾<~D£N4`E]zO> bS$=c3V+"Gd*zFA:1)`Dt'1O}FQPBIIzkfa`jA> Z0#hzQ]+C n] >B"Wx .]MF-C{ XlĢڞ>KͦsJ$r?Vo!j2T$3y7o .US1<:EӘ G![lt?`hZk Rws`t8B Hѯݝq5?> F]YhwgB`*'Vaf~O؄42 aD5Y[Bo?3Z*ϋ @7 $)1BÈ3LbZ[wi[GzC\g#NxH<3iup& em"S1/B.F?J!S&nbmJo*qyۛL?o Ǎ:G 7̴)yc|l~'l׵뷘5 g@ $6t AN4 ¬CĐby7 mfRM\~ #y+AXZJ4TQne?V}M2B&we4ZHnsVj}U@*My-癹ӤA9I=Y+V]ZM]Xr<`D\$`M6T#w$ ycSj1p NIλ"/F9 H+f"R6<щ=d|Yu><͈1yHg6Uش'ʞ{E_@GK!: Ɨa*-@D4l0ꗍљ`>r(,$T)&q}8Eomzb?zud6͂P u$gv>}n3Jm$߫g2"Zwr¤E%B&_EX$i#Vm`aQgQ? HXTᴸ&[KMrFTʫ RǓߵdOD±7W_qE6kGsωHDyRLkVYǟwX*e)y=Zulyu+W%X/ O/Nz/v/U*Ug{Ai/$@T`*F0"cs)i<@6 2UQA6J]/i E@ڎ ػ֊x;q616<'"ItI]R$JPg[{7rRyv\X˛ƿ-EV6df̼;>oxR o<ݿ> L$:&&? 
}јGcreŠ7<T@'UӠ=bDoLtm,X1 kaO 1IHݵqhiEAiF[%Zhe/XpX_ΚWlYxHuffZL1(0"kZSREan0A%=m,8v$){SH( ERS@F5I◯Q#:OՈxnлÅ#rfYh&c}{CPzW:2*Ge`;Dduƻ_{[~( pP.za=v#{uS{bqMuP/Ql^}@^r_$Ԡ6M Fǝ Y¹`-қKٰ` LDO6)mE p,6M$8`v@e*(80pe+ͬ hDO!m^b>ɗ hQ~alH '|_wĞ oRU)wJv$ porlEχBNdY+xIZOK|_HPGS8kБ"O=7#Ko9qނHͧ+Gb-FUUB#PW/$m'ds< 9-Qhf Ek$D5H m35 lUx+~B1a頪1E:h feK2XbV-X[1b{J|Yo v 1KӞ $Oeסk:$k)r!;_KS =\a)ЌBɽhCN>=gk$H^dmֽ=v/Af_ǿ| ݀v8g]?0 ߖS6+̿/߿]9 d+ΔRAfmE$l\?G<¿M3?T:[('R‰ *8 JeeY`F_3CSﺊ '7Eɝ+P hΔ² ^˅*">0Bx. :OwB}s6"6gT9eŃ];[J0 V Rs:* Vet:YCm(-Uy33mJeIj-K[%lA6xƔZY2#ܺʜˉKq!2 smarķR]E!!jGjhmA(rYWˍVBmV菵pZ+R奔W9yв*ђ ڡUQRJg|=jɬEn XL}xUY;\t8!zHpM|z| g+Dw8lL# vz5] sʒ*8TM}u p$TIS}BmUcxěU4Sv+* I0xK8SyBp9u ̭搉 Ed ^s"Ť)fcADbSǭupHG9d L!:"%"8"oLC>xx%r0ʠ}hT7*9e#AǼ48ŃчC,|4uskӼY%f'z} ]{? a*4JQixmlnF/#)9f 8C \HZv `LDͱ(m\OF i0@5tV782QHtɘ.j1^)>:J$-=ʂiT&yWzQco4X_nz(@͔!]FK (&a#? KˤGOg C_ i֟o uY۷*J3+_-zuXj( Zx ɜDI?AEټlWk?N3T 2I4fMa0ة*4e>wӖ-.eE 2%eZL2Ӗ]i$`0"GIbFɅrBf0IWr꺔q/BAs0/oɋ!-y?A3޹®E)2t=7ǭNM󶋵+\z0b{L4= |^Uvx\?ZaӛWƶ;Nr fmȍ١0hc:5TJ8}Dž=divw),ɲ'AlU7fRQ%F{n@; -f骄(^SY>HޅRbzScПLG7Q4&C'ZR8׳φ M XJr P93=N(^v ?~g#;Ntm( *'81?Lsp~kCVIBy{ߕ告I~ YHe sLNH`"VRJƳt=6X+5vgR FKԯsn. 
x)Km(̅rPQlH[uRU28,6grT`PT(/|]c&?XNEh2/:?| Zq<’$3J$1Thń4dQz1[hTDhku,RR듁}k(WFrA]ȯxKlLݍܪY&,H`4†f6"NK2b5&\Fq,܄i":KX^trL_((ѬkFqq 3ο@Q@f|f lѲ9ף|1㿹hky K6V4e L,ߜ|-``91z48K oj$͕-uW{1 aL{ErnV;ؑj$}g^Wj]-Fla6K`u#+})8*P)PM-TYTg, <CȖ3m+ɡlk!aJs ?lQC(L[ pjdke{V3j'<ó;&N)6bRW^J񃣸~%*_EcJ`5^#0-h.nG+<+?ZkUzߣӯ%[d{Čr%i?"iljsp[P0<]zA=:m`Ɍie_x=·'~X~:1gRȳp77p1L&`m7\z5PIfTXc)S~2% V(9/nfXPNNl:qaߪ5fIi -'kL:Nwu"/St"&OAa^Ͽ82j͞NR0ʩbGѳ%%ɖ5f~ɋۨpζ+Ir_+#T[x3Jy91;,.Z]2dvG~$֦elo&<1zE{#dc԰ tw0o lEY"%2idK_RԒ,J4*h,^VUFqd$uhG7rs#={4:6ve:[4$=*``:;ߜ&S]{<{Mg1umZ`&31RBqƑN2I`bXוּ ߾__e@%NP2+s]iDuL iIKҋws|gMr r=Kգy+z|zSJx 8SR1 TbpX84,}_k)A#O.W U0z3PuWfj?q:TϫvLnEχ麬>bo ~$}ü^pA2WяuZjeHhD&`HƚX /Kq5"2l(G6M"l5ہR0M ;R 11}H0]Ӥ+e`ם,+I>.Qf&Q]#uXfr=!(1y[wYf.no+Րσ)^wܢxMRw0Y- G 0Sz4:Z,ްIE7Ư/$|l$Kmֹ`: o.['uW_6ySۻ 2YtyJeZ5(4#DvDh]F0`H œsr4'$J\s;LI/_:{b dP\x 4YR2gZյ꭬$$ecեf hʊbU@xߵL(TjI4CF> .2i1cXJP(KȖM0p~@Kͺ1qмAa$._7qќ?{~\!zp࿗?{-Z-c?O}[Ov*#{NGc;A)=bX1 ~oۯן~TFL_ӻ2aa0@`|tՙgU/)v7n0j ݥ,71[s>D:-ۦXF{LGg'lξL`;QMh2k;5 n|uv >ŠiOiZɢF̀< -|c"cbR4bl&Vs%tIZlEr .@ؽp('- rZ+Azgg8c3=:o#aXev:sW?hGک ƏO flʭF K4(W{[Z%K8WDS~ Ip`b1Sw*hUREw VY(dBí"z zxBAS< m`N&r#mA]fa+E T4EY*ЌްFx.ͽ+Z"igX#X)[0I"%2Mex^شkB*8~?/ah.\-|9dS] ?L(b]L]4(Y3*h7^ԉ~Wt WNEˮXe(ŔG EYFͰ[e`WԩC]+.bV :<^(_ RֵAdj3y=GB6 u}dt}0ܙ}#զ Dw:{b xKW^-VI1%(D"eLLŅ7S1s3G\=H c~.$M0&ncz9$KaϙQ&$pK&Ol/pŘ_<Ϝ!)4,[q$b]<#몹cvW;# )VmUy5rC*h}`i4xbW&On}bil1`y RLKqLQ(EOy84|Lٻ7WyHuއ?g5Ge"TwOţ,. 
RHfdEI[uZ٧I~fDdyQ`@T{\stxQ0{R8 踤=[{HE=YB ?Y ~X#Je.1./]RBiS!st4(;]-O?L'y+(hY+@#T@QPlɐF";X⨃ьkm'Ab"aţch+6P&EF%4:%6ӯ 1_B*Z<$+4%(z"S#@Ft|ܝeQJ.]>pK/uNA0(AD "eȼ$/ L"~Xilt:)֔=E=RJeKTv}Ep i?<^U0pĴ^t1?G]Hu3Ӝr`RՃS^2m<T*Nt}t8yDo -5>M*6(z4OҿI;1\] }a%4= ,W|3^f\7g|;5bܔ}DcnO 2%QBh$~޸*OhH`"3ԋt|[c1*S8H<">XU]yv=&+9W5]8_B/+d{|fhG)CIGh(2B9gV`c 9!icfcş{@ThÔи+(_L>II̦9gC<~Jh|pE>УJxF1h*]gN'%VR'_~[#<xPtsY7BC228ctAs)5; w6[ϟ&B$)Z -To|XHaٜ;E[ >SdYD:exoRhtp5V ןtZL R?Q?LR檸p7U -߽~P#)9KLq7skp[7Ԅ 1e L!Bpx%C/AߋMR\dU !Iڅt7.eҒ|1FHiN=)")Q:/lZTu)TT]i[vU\МߩN n~]`dǀ,VN:eOOѠxF"&HβjگVBT]&h@jvw.$f3_I$KwSt 7:tfG0Jlp,1GCvŶf*HY%4`4շo/ft4(qoWd,:7{JSBs>ϑ;2z@i>=PUٮf$R+,R:ܗ1"2)yRMs"`P#6nALOS<"{; iOY?hɏpIĤ#DmXCt!r:pȰi#K%Gyp (!Yx(i)4KC zl%Hh25M[BSxa5eHxv^cGʏR2)TZGx̶Bhb(il __Yq`p3^S!W} HfHo6}S}10f2G>W\6{,Rɞln5Q*'Pt9h 6avM!9I9%1x8/c@HGm !ZM.vWv˳+*j/y,Z̅Xj.e# 1C` ^j͎fp0H+t|ZCHHw7Dqz6Y8]xk}£/Jht3捽`*0K/^B.3A?$\`W@ 锊ݬg!52ZI1ّτbeKN "5(bf@ 1ETB~4^((J3/JT ZP-~6ac1(#‚d*J'L57!RR̋~c >=!%1p7F։%U|9D?HTb4F(*:Bච̲riiz΅{,GBɻV!UuXл -!d7S ߮IzO_(G&;A:HC=f;*z ?>lsٴl큊vʙ6+&;%z{G SWlc h'l@ : ZdB/؆X! 7NBG3QT1K:L4xpbB 2d#?shQ=!xe[Rȷ_=Qzm;f@-(&┈(aw3ⴱ{x[R$M m85M5sl,u(#&gD^/|يmt cpyI6W)2o"ϜOhl0qXԞ)t?=@TkKV,^Uz)9$K9 ښsm HgK,x3d87/?ÿ'X7e zŞ}Uv+kcHD'jp*[efi2 Wcbɻ[eV)0m+VJXNajܜ~TGGؕeH'V[[3=2*UʘȰG%4v\qy^q)rv=.6f}6^ JYUy+;b"%T;S(p28'[>- 2*UMiS\;!(;%Ooymg>CdfQ35%EIJ4>APMk:T |lsEZd'b>si>,5vi2/χ;)?@_lGцɍ]*4Kǡ|o.KVݝ̺UCaMD`yj .m>U-󩌾xq&+?;cW[x-zw? 
; ?-k~% 3?%KڋOh,|6MsӟzM_ Qϑmy 7?z۟Q`*i{,/&>C$eÎ8P>LJ/{RuZWRP"kp#s`-E`KY22)RDH QYYk3X!j*.BJ=~Eh542M5 z-&q 7ƑnXJhvSdG\S\O?UPc?Ȧ>>蜪day>۸7>}QeK o*yd҆_%υec8?@BJ!Qp龙:e=bFbr\(H7PF 7m9ɴ54He?Ɩ -P$WB$< &ϧ8d,!"㨰n!lM|&힊ظ"ʇӄ0E +Iy!#GVa$1TfFce-K{p%`g^(Sv sM߶*7_v@ q ÆKTK+KIQTLqOҥ|"c}Du~?aO#&1^V2`F^o3@xeL&DI2zX%.%pCewap1a9KtDg %J!yu(aڞ8x<];Āe"=}':?lui%#CL2WLj$}LE4$ ºonaܺ_~%F}&}8yuZtsRlE+s2M/M?BҸ2"ݝ<Ӭ)%\`|nSs]%sR(D rGG@Pf)=gx&p\xKQ☦UG_st^HH|9N3Oc-=Q~ `=C0N|pzcGfKzzLll(׉0.E!Ex @+ wqDFQe/VU;J$1M FM/#׼ (ec[UÿDUvCtZގIÜSG,w5ryQB^< ?]P<#0jqRQՎ]KhI.Jh{R%/i0>c`%mDK{"V{)+; ѪGZ,uYh:x{P=sl]b&y.9f~RaޔZk`9Y\ځG E؀~=[6Oٳz{itRh㍾:=N346yL@tS5a2B Fڀ2,gX2TBc.mߋD ȗ],xoHc1eqa-rH?Q>FǏI$Į!*żXB;P-uX$nZᓡ-)30< "5(b?1lI ׊[h0_ 彐EiEBz('T9ޕ6$"el> AUY`<, &CKR.o,ɔDJIeFw$gq*?}/QT]rBajUE[*XP}B \&XeDTNc`uM*,Cnt3|T=޴< [!oYHDci)\Sxt{(ݼⷦ09"Z~;"I(_X2:/gyglF%V=Ep[vEƭw p)A&M )/m!H+!uГxü/A/U D lZi)0dݍw:)7R$YO3Q hh0N6Jp/ pz݌!}J=Θ8s ʔS~Mޡ{JI|boB1R? /g.">orn/ZDHELvYV(u[i:B'j{Y &w;3CcX$s2qFs65wRHG/р2W`A">9(RfCr`7zlmizw3`&U+"϶?3`:VQuMRo!R#L2̬㖀F/4E>Ǔ y^o-\}}Le܃ (GI|Idi *wa~Lsw4as's;~Ɛ."]W![yG+(Z6*EWBStIIu?y(qUc16,Э8VHF][`zКVKJ-zjhs:8Qe;pb5Rrrݹ .j^)z$l%!Z ,E 7D)b4Xw}_Z ̇4o QѤ<58{^$Ơ;w} Ţ"Kڴpsj+j[eK!i<*媳7i3w8  J4s q_D@2%n~Pjf}(+)"1ŗ錃2 մV&~6{|{, c^`QBa M9Zq4M-iW-e1JP.c^ 咄݊u :L LO+t}(00Hq<9hR?D?ORHGK6Oy~ e>O_&=?{cJPu'Ԡ>]GUp_dy vQ{c67{[VM) ']T9Ϯ^WW}|E 0%Y n {LHĂC߆3;K`3&^,OʼnHl0#x8!iRp2M3OTc* iBpFA\^r)Z;8}/Vs>N1yP3k@o1אEyǃB B.~:,"a(sp_Ms@!)_ .F^ooUԀm@ZH 1@yt(.8A[oA`RTk12O'YU ݓg(9^C#;;/UEIQ@A@cªFSf吐W^щvpgM/4f{_]M@T-ýj]᠍$8V=??Gio5Ұ{!| 2cC\KMQ0A2hhbwwD 6 6"BxeDW]QPλe\uЛw0oJ*8gT;׫tcf=Wy_/'X+}'dp#7y#S&Zg.4{GعYz(M?03ͼ:˧GdLH3V)'OnSiLb-H'wcB;8FdR,QvE\YaNCeP,uiF3,zomΈ_2eP8.>{ ~rlN-MF6\ѧm&~c`(40Y6{r00 b:F›Q7fkB&HJF@*SLX6`蹔}|hDîp,SaUQ~=K&_O+ %8@-ܩr ϓ2PmMW;q+*@*އNUC.K=cƔ Q(8ʨ=!,'}UUP\{1Zrޅ38)J҃xmvz* [+YI d)gԡ9jP*XyJ6 W{a ߩȗǐ,\a\ϱuXa061`uF_҈`٤&ŽEڨ e֮c̯ס|qJO2f%RmeȚ[T~!Ri[kxM̯NOlb#n_䇢4m~+V17Zatu`*u\yݽ 0w/F;%PÛ&b,^&DNb)ҧǿa0>*ʙ wՎmfsKa.ψf϶tssU\ZvVs;Ԏ.+DbRr0æ\Pn|ӂwT}IoHӘbbvEmX7 Ax_|'޺t x e@\+ Ioy @Xu㭖ңAz ejn㊴̹sNc0?k,G 
R3g(*m*K>??GP(C̢,YyId|C'ptMcXbֶFx:f }6F/þkze5"{`ʴ,O/ 9[,V]Zdy&^4lQ^T@Di c0-ś(D}Le+wQP,%R0%,<40*Cm\1FF1k'!a9{Gy0YP46gh叚LB(ޡFsdJ||fh/z wgyu"4'לֶ݂CAT^nEa_ o5#64jS$A"hYsAaMZE_\nVuFyTR{'8ܲ)EkQɕjpA.ёqŧ)@`ٍ'l6E؛t8 {ݳDyѲ5o6a8yjA$vVЉZq+8`cSX18kFbZ+7zP6y[>"=xfRLRX c.:i<˃B8:ͤڐ^6o6]8=n/S8hTd_ xi4!&4m4.|*h z%2lCrI3nC=/Uhk'eP~V^pY 0Ǎ!*;a,hjڅ5¾ނXG&`f mc-ߢ`bDLA6JYV (̚c0lH{cl)l,Z%}beR (F 4 d 4Fܧ7rmԲ"ُ|b?|x7~cͻM&oaecn >Tmߡöhp?uOE:#v4[$tz% ٢$GE0gZ|)nɸinLiRϖT# ]"2~jRȾϑ~+rqJ)a?!,WrF~?~"|+MR՛3r>p9 Fo֦)amJlJx@a+FIujTHO{FDEh."[4IXe4A~8W7yT1:}l揫֕.#Mܞ%)[95&iގ<[:Sg4^K 0Fq;e- d4F:'ܺyK-쐗5eoF龃e&-Gs2FVH0HށMn{_Xn4g! :f;6tקim 6ndzr V.Us(2z'rDfGE%,JIӣlC׏iwI$ep*44jm<6<##PY֙"\)X0k/eWrǸ\F =1#x׾7< 4P:o' T'eΑҖN :>fj͞q]%;WѓAӊ:cs{: Qtٽ9wYBϮ@{d,נzf҅yNu?3C97~Ɔ&T'f'7_' ނplG] 1}w!'g[I' 8ÂV5@e/98sX2vANٗF^q[{M׮ mAd~PkM)ϟhKF2h%SKFƒU=i- ƃ1;"j:lrjm=?Al#=FuCs!xɜ$^r\/$BW]oCmk82)wh'2{W,%kI`&Uq-h<Tq`lD2I7/6[=xG[`A[)z{ AfXKJq!tXNF{ (]y``D&_f3:~.&_&dz?Fľe %Np0>rp4]LƵN{z3zi_q?jL0-:f|?;xGF7 .BL@y1w?Ky[qBg X4t2KŠ\pR"lI:梌r횑˻:1okIq>j1y,z媹4L|Q;nvwhף(`wqTƹj V`5`Rʎ#:m+7gnys_0poHi1W8 B8`yF0>P ;k4Zk%:=b^9h!Lى$*Z r}RATee0Lg]+YTA*3ׂ`4AT YbYf]Ƣͅ٢YHl΁/EdK3ř4ϚARgWfI7xr3^Awxx>Chr.krQWY>qa=$%o[ʵK@`r)-j akhˎ]xP%+&q S.dqEZԇECz}C!|]p{mhoڐiDITIMHl/+8{c̴$sS ]Ҫ]R7Rކ8+@\1ʭs,7:'Cq- *QQBEIv-Y ɇ?oIN?7b\x>{O yɴϏJŝaFtY폌q >QėN#/4%JbտTÊ829%hj zAn$jP a6-Ҋ1By!i1}s-eS#)k]agߔ.)10cP)0J49as8pa`Q=V(%7p44<VyӲ(ƅn }>JՂ Dg*wxwڍ0_ ֵWiS-:BJ+N= P>T(7"`1jYrjѕţE??q%<ggt}drB0_0ij4JaYAq89BHBIXc0>icA› bpE1؂J'*p8ʬF|7o0*s2pF] >r:$KKiԢٜ\DgGԙO?@@_Iv/xiYDkqk#A_/*̜ Hp"b_c]Α]yKNҊYWlRAn@vl+Cׂ쐃t0 fNWjK|j@4hξ)tʶ:LjhlNLrO'9D'-T mgR,*Bcڋz!xӶ29gVC6Q9IYXtOk|Zs̢ؗ !Ţ ;|}ù_1 {bx`VfQ|tDjƌ>ji0OGȍd Sr3bNV"l]W`QiK]c\A 38"%/`"Xg M LtRf} +M62L`#Zm)P Apv]ESz:60M%ep~tzt~|]kc0D.k +$aԚI"k F i ߀OگOkFxt'Lxmu&JPD=#e ֜{]R뤨ʲxSbFi+҉X=2h& ( qҮ 5mBD\BJy)Op[MJVl 0]Y $78SP 15M=j"HSv uaҔ6_6/#cBkF,l^\oEU-IlF9YF5N2~]Je? 
a#|`WfKQ!T (6ruZDqm=NM0|7'h{2WFVbfd'\ Yvc49-zX[Ĉ/W:?¼tHtƲܦ_1:l=bC"K d&)P ;_Vé@껈LGeDok Gk 6#(fZh?}pkG5BqW R=oziz+c]/ދ\6H[xBZ''nAn< C&5Ӱw\S`ad5n0("LRxŕ-1&(`)Wv|>>=h !*tޤCHv$'/Q] #bLJRqpz *Ǘ$naVtt`cU m+N!޲8@XgcI+=% :|ۗӕ9s`:].?^]h=g8t'+\,O ?N@Lû+oHU1Ck@tqWs~z?}8=jÃ!fqesm{[7E;w%y.}նh莰*9oW[@ŋ!,9zޮ=ĿTw>qeL1>+FA/*,0?0O\}E+JĪNwTJ"CherZkN?Bܞ3&] <2y*2cUCvbP'#{m!ypw^Rjʎm)$R}u >c7fH2|N1BRvLEt9%u @JؠY"i~n}V[4gEt#kyNj1c/|8g$x“etϢ\\- mvR6WyR!|ad&{:y,ԫ-Xjy MS/{`$}뗅/A]owF؂eFM^kRcmJXst|HvE[ x|N &2~e w(8E[ Zρ:U=_ 30~:ISQMi2ӂ=BzLP;3)Q-57jՅ҈I:5 *lon?9Bf<'^|uDuH5V5bLoW?{lMz\աl1omS/4ȧ^|w#7[iU12{s<@?JS˙OhDvwYCP,:|Қym3>p*Փ⊄:U*_$ʄܼ\}.Sтk3Aloe>/~hf\Fἲ.I7r] FDCNV6-M/"ɅFV=zgJl'84n:FM- I/MVzI~wh#ZLjw}XLF/hJ0 iY!1[I<2ڇecmu2.٦3/g8dRI}K1R>^h}Lή,IDh wF\ռNyc<ϰٽy2vV߹RG7>RpyENw}rSav٭9~v9!LpnvjS;tcϨ&gsjڙ}Y3uGEF;gc8oK wMb}yox@Z@yi 2J m!xyo`pJl) Ӳe+dwR|+Z?ݟcr&k&{ɛCkrLڑߘ CMDk!OEg zb,t̒Yt4 %(QlxNԌk-~ϳF)Y+/xoVW~٫oix޵-0Ɂa+ N{>X>4ˣ>i#ʖf4rQo!.{ySM2ZX=y-%]OODCvS:#zF%  XhwYM8t'2 '"Y%)Ъ#mQ n? -:O)xwrÃpl.^~8}_˙ҐwjGmњ+bԌ({-8Q(> 7LO %?5'(qe}S[fO0{",:s&ӆM0̛60-ô1aw˼4i4oG7 hG)_zj8iq}OJtV&t/xFۤRAdZvMEogd+ =VΝ3%=2hK̰/*z?]:ۙǯ(O^Kޞ?kS/*VPj)UTz ak\6cL_ER8:10#<8X\fn)= w͝묾jdU#Y}jVmwf &(r ϥ) p~:\ˈ )6J7|en]0w)ڌ.=4fOTލy8;{~9;[.m;+ hJ4'R$16ǜu5Ƣwݡ;%ȵj8h[C 0VB]l0ޭR]X#YjGV鑷ujBIz4/FQ]_49N xR+HӬ+r`}K+8h1dMh`iEأ_lSMÚc&axkpwoJnzB^ւ-fA0l*R tPRZjW=sͺfB_I9|zC 2%#(9R)hL[oCqcs26Y3 6T`ML 9e I>DoI@go/5duF:UgT5lF)!t%jxH($g0`XxpfG;{{2EJs0ss_>o ǭ]"T uC}P^ǝzܹsgBOC5M 5*՜@]{U KٮCzm(PTܚ\ʘcĶ' fLgڠ" %gRh8cpi=&n-nRl6QsȜr㒱˖&hcݖF 's,qRyfV#h$>WkfyB4Eu̧GsFaJM֫^ L8Rm kٮ7̦f67MW4mlAD0R:d@<7Y8 lΖ{ꇏvj/3%JZ38{a,{mٛ֨NL6'l, [J#HuW44N/7jJ5QuI$-q$o7Ohx?.#+; Tଠ.9Ui :j `]*i&_O!& HunJ}d?17qy(YIÇ=qx8؏^?]4a9TC=nx {:Nu&fg S3]7!n0x<h6^Up)oY7=W2lISr241ڤmjn)"r)]$s |XkXRNlf;p3dž96̱plnI3#WjԷʇ;%-Ι(CeBG |6*ɦC'a? 
SO;]s+32kp0mddՙRI֩/&>‘)oǙBO#8L44U@Ҫ&h y|Ҍ4J0 $>z8li%KDcpd}|$7CSk 󝛽ys!L<%3~w|R#fU,l}=Zs .rYo\ Υuz m|R^W1osal=P(M!YW%*s{z ~<=lhQ6oc8(εc`Av/'}MӑxWw2s)a,R]UBZɷF=P#)"y{zj6{M_|7k("%߿.w;rKjBe1?B|f* 4xa d,%O,ؐymT Δ>³6޾r1}E22~EOԆ!:j9Z}.^nYC:k}5 ;PSfQO]u芳\}ƒ){KK}S?R+%"rl/>ͫWp֭q'+$=>g4\5:낣JX,OpKLX0w@_s3sВF˾W_]+ ){0Xhgus#16@Pqt~$3ˬ S6 PF­2Ju@U)S >5G4ɭF5Un1k4ػD3|^իs.?n{6>4 3}6L `-ʁJƫy9dcxjk]Z%a`i|SRY1"{++n' 8ʊbɾhP()IAp{\t`3cS)μaw;i(0>ox`q`#LTrg|/ꙃJ<1׿9d_/9}#|ȁ4h!Y5D@ )zklP- ;eR6,)9$aijnC>sil'[ڧ}C8YôyAS vҰ{,$9>iXYu@(S3~@T5`8;it6===~O1ӃYnvzpUR`8^JǪւ#`Mu/A $|Bt pl^J7*>ǀt! PK9}psɳ9T}iPA?P H=mhl}Tye%KD(%GH2}4c#c7r\бgupzk"qrBƩy擱KMl#P]2RJ6 Y 8Wrx{h6w ~!5%=6ǃ O i?22迫cHX3s|ӦO24`o:0?_CyCy/y6GfB Eg EȟOynxI ByTa#k]/x=|_DφK)&Pk9ޭrHj0<{Tt\l4L]px^,Y9[>qҹ4%0UO6pL]%*s.&2`7{O! Yݠ`29WrM*=yK*ÜET{o-ط9:>m'R_|W "{0uiu7rmcۧ{~'s_R׋r[ K߫;dyduw^qٚ̋ sx?jH&u><'gZer6. ޝ͊>q%A^Cs CBJ[9 UOi۔q爜[?9l:q[Σe=ox8| 8?c\bε^uP4(Cо6l5c# 7a#M1aQ8pO,~HP-A&axG3ÀÊ@F.}᭶;JEti}Rs)᮱:K,PmV@غg((k+ԡ>%j(H2-A){ mv*@dX#)f>p㑢@eoڜX"` ’VM}[ B1&F@:<cRp~wJBWࠒFK˛ߏ@<:!ZDF[Mi BTZγf8ngJo)FlO]H^&5u _~R/?>h7?Oe/Ax]j'=\lgzǯ,:4_FWd=7FKߠpԔ [5&&`@5h@h6^{vi-ǘ>O )UC䴸Z$5Kn:[|o rBpָ/G1V[-yvU2jf];c٠`S,VV5b]G9(I"Yo^ 8ys9\ՏX)9ivݎE!aXm! {6;[S!*gt\5FQ {R"zsA^U}o]DGt*S?_m_gq8DV3ig#Ŝ]i3~BHV\':N'PKu@QӼ[Y﨧F:cmWgwasRц|i]D2 I.4D0@ba4)f |ci.Φ"bY ])VħQl5OG˓YF/- 8ө :f.)(>ވqbʀB^xg?Uߠoέ{jB5~Ͷ?`x^Տs}6|T3>/-DRLFFh0CH2BF*2ˍ,hc)BktVyBh3e]rf%n-#{ζ94oN q{ˬ.Faof4 4߂7~Yoۏ9+Ws9y5XKJ(M|Ca!_}/[f1$Y;Q&u. 
MP΅<,{|l( D=T {X\r/ҠE٠aH3 ̢eAnz@`CoYT۽=?~ WF41(@wBbKJ`|ix3H,Wܱ͕77A?7ٞ?7y4QP!{hp[# wtw^^=NٛnW~=﮽Şo D>Usn/ |ISm- %"bK =5űmNmIERʖKS:VwתXbTƈ?%JYFEمIZp*!2(*ms- ..p}9j!&G;v*l0m*o913a-yA }g˫kq=>  2?oG٠ [2bO(gQӯh-jj`Ee@Xha^V:' ZR76$eWљNu Ϥ\<p0XʾKÊdF<`PQo0mZ2e%W=%gvOi~64v8[sؒ%<.7C()(z MF[b<̐fЃm(.(&XjmNA c l'*Vg:jyfdֽAK6ݨHƒ꼃Q_T:ᵶB 6'3d ki\ֽO-͒px—w}.WO_>A!E_xՅGz ]))25ζ(K!*QcB#N"a Ȝq qn<ҲЙvAh%SyHqBX$9$_v]}ȿ+a?hZ 8v rjtq'+}z9)v9ђI*(2{ x+ȉ9YO% ,x*}'JܻV>*`RI4=Ӕ[Vi¾)QA sșa$=2oF<S9g N 6cO[1NL)\ nuI:CR>s4x i* uFJ(_q"X?_U.#vONJ=Vo}QvJ}MWvȣmFق*9z]ENsz0K8sYvz(bCxqSYǫ֍ 'ToDpܴsv|>~x/uLmB?F5&9ϯs3bs6~o*w2D͏>W)^etjOrύU {579DO̯6OD=!q|r,* I2hl1[h to&Ԟep;BႫ>^K=F%zBO7o94*s@6),p|i $L+(&/è U*ǀ횹tp[O펱ohWh]B좌jNm)h{cڭn έ VhwQ(+حnºd9cDdpۭnvLr]Km`ݭЉ=`́1VhBI v ? )GB5}8gHrnlS fN n"BI猭I a!Ǒ_>}([JRKe,@ ЫÇU,s: tJÖ]*wX [B"Jr/ZN#dL#ڿ"{RCFA[\P pBsLd yZ"To߿ZgScN҆|i﷤I I"T Yɳhm׌)4ZpB~ekofH%/bqT2Μl9 Y7~q_,g+wR봗bJ5۾oZzEKZus #ݴ_qZ-Βհkij}FI2!c`*n}.~aFG[Qv#ڢG<5lա9yKf3N@1wRR+mJw~Y2q69 k59I|Rяʴ= TG˻}|/^@L3H;hwei>=@wi͐GRIgM3ڀA$#AN̉c\p:t5_6Hpv6xK9{fv"Ė79,ƪquk-`AU07鬷PCKZi qH[̨9\AU会xr+Fn]+fЮЮ^}v)&\y4B>gcv=v!."QsGy5xx.BFAX^ej()*k""@=.BTuv}v(ć5+S` sZ#o$Bv { QjNC¨ႈ赋'( " CqHM#9 "By5*%#Ю^ת#=hi7ЮyJRU-.BT2]t ]/·7I˙`2 $D$$ڧ:QVJHt&E$ʦtY@0[:x) -/ %1PrI\gVLģ2v^K];6K;Kt i#Vec&LR`"VEq(+fP|,q>Gk7Z5MQc[([ΣH1#ByǘBH!K mĊ0G}]&oa6>z#" P5ZG$"f(k\AQO93iF|Z$.Ow%QOQ BC^0=*>=9S ;}ߝG}R?; ڟ=5aяv%/[Ȣ1Ú}tIk6&GGT8g6;!!u 2gs zg\JAIwtg'OT4}_U!Sr%i~}wiYqQAkYV+ i)sFJp6IBm+LɠuYJ\N X"Eex)Z V!28 dӁ< fa4ѱh#-}t>j:um('Rut` =C*h=*8.c@JmQĉR)J u:E4S}ENNڭڽ$Yq"{r j ?$⏍ڵoZ8ZE٣>HD'<Ź8_홧mrvoJ'tY>-2_]%9Qb y3?lWLD߬-P|Ayog{Wv9B}VqbQ޼nAHg* 2z"$ĕKTַkz]q={K2+^j*BLì@~#iG>/|J_FJ6=w jTRv5ږ-FQ4Z?(N9W,  " B)sEtKF\sS穦Tn "q*9~׷[Ztxk* _hؖݳWZÙމ%۫=Vkۼ51 Tڠ."L͐")g2+?""#j#*G OҌAS9ϬCm 42yJ#xcd*exjzgJQC}=NgߚJEyq*ƃ|t{=VI6ѻ]n^xߜgsvpJ?=Ogؙߝ'Bh->;8q` YTRߺi+3vih[$<]m6+7`w]wr<ϣL^ %iMU̗Vڭ3ZRNm)T/ pPh%}F7''QR"7_ljۼ*2e&,)  O[ʫǽɳft܅ *9ٵVܔGv<\#f&5f;ȗOhl{ѯU91k4SA}:9{a)04ShՎ#_lwԟC>3~,'sWRGu/nK?]ߜ_u!(ӓ'߼YV~8ξ/gZ/nt$Iʏ'-6|.u2 gUb#.Zl,y_'7g|:*ʵ2ն0_-J>_L\+̄-8< ɶ dU鼴6rEM|m FIA6bZliy/a ꍫ|JTVb{U愷 &)4{ 
*PrY]y$3den1 ej`cYSfeb]zM[M 79zGO $ߔ KnА2n_7Q= G(@ HO&ax ˒}ݠ S*&,&ٵ>85bT|skKshodWo+Wf-""mĹPRJk>hXFEF0}vuL8czzQ6jZ3 p2`=+MÄuy0̇UN;ᾤҦVJMсM=۾ŧ|4H`+lu,Qwƴ85YlglɆ)aj:#E(mήUv֤7eyH.7st7վϲh6Q5L?^XN Ȋ@3kJY1#$\vs>\'sn WF&KuQv\^Y;olw/30bh{yu8svOE%a䫡#|W9v˰TMeO ]Nަ:PM*O7{-b6H9Ouh2KgY0=|)Y5c2Dkt\/(FZCufT_5|`-%41>$}ͅy7y[䰯ȭs^|Š$;HPml$:ur:'jRZph1Dj U ԭ}+Ms*ٖiК1kx } S5NF'Lv0.4ĸ(f\8XԄ`E h\Ʌi QDn?޼ʬ%bv ^s?v2b "24Ruѯ+ϹyƩkA8cI5e\ӴB䕭KttX18d;1rF{U.W;1T1B#A-xNtV\>$R(9o>>BRP1^UeKba~sLsJ K"٤!S?jPSWHi? IIڿc2(A+:O?&E^HǼ~B^Y5/}rl4MY΅OT%myKy$S1 4 1Wc!cq0fm&->,83mBI f ]wXZ w 21ܓp3C-싋nHg].G8LLS,xKNbUW9K9ڞ]bF3 \j d"%aRm~{X鼆џ̎}r B߲!Fv<ĨvlsU>`؆rpclA;=EOMH` /ml )fXJ1Baf[]5ƳLҕ6"[j}5Zmht< {OFEb0>*RL &d? b0J.k#.8b}s쯱0dS^ w x]F < ƠF(5!fpe&kAg`LtN6ctb3j*C1\`Dwtp$o[n9q(A\r0F`^QP TMh7JL]a( b>|ퟏ+5:f8(ܺ09I,˔pAq:Y&)8C[UTD-&ťT {AHH]ESH=]?|R8J>x< xh:pRL|_+0I j%FYT 2 ~m({܎.&̟Nh2zq[7n4Gh.K%䊉W!:O4lb ɓT9C_&VAJ҂Uҏ,,%[Ĕ7yw'_BU'^QQ{o*.Չ_J 9Qɉ)MrY[8Ƽ{xORi:22mbT*yj-S1`BPq[g^:#4M#ܤ0T'ht j2l|Y.2\8 wmH_16ߏp0{ .&c'X$a+vKvK"-v23bUz+qNiJS_b L1fAI0_Ε,};@ve6,KRمdjOUZ2r))n 2dqB>W7Ptk`V 5skZK`+B5IMnGf9?iPLJ<5e'>8ΰng5f4zqeYw(΢|9MΦ&+@܈qޤnw Z%+~8qn8 %z0or2U4dog'v^?/ԟهE{8۫<x7\I.)?'(|a;#3y~zNwU05sP 3[bUbPA弉󞎏 *K)/O`#gj %ۘn)䝀?Af!&!(S,b.e N՜[#Bt0 %3K?5^hV C쾦)G?@0;b@Cb`u48Kqp)n. åaR\8N KUn^2BtQQ ą\ǞKrDq S_cݥYZi]uAs.<(FSBA)5@ +} $ifFagF~0\YJ\7$i]k~5 r)h!^rֵ X+>{Y-4k"g]V@zpXJlB~o& FBZ[&7L#k2( i,E.نws*֌-u޵;dpsCh~VSS:{Wfڸ/[s[a= ipyF>0/`;Q V9vb@Ce_ڠ1G`6Vc,n/[dMOLoInK_M$&kI+|30eTJեs=Rc19fz3oEQ!VqtQSϫB r<ͽe!"?`pfoqZs|l:K").,ʳý{o|AT3`hAZ5pFi=䵴 8ÊUI>}.瀓]%FsyF\J*;j28Da?ICSD B˕W7Z͔uK|z0 0 oz3$`ZC6 !qHhbs8L^SYnE(R_To򝷁FZz0A:bXAg˗|AņHՇg? 
!؏KCP%䦳W\f15Zd!:O' fai-Qw=9:wߟyRGhPguqwy8Yh9+Pfz]\Q- xanlT&Msrp™ y>$9K@*η5OA`ξWmƛ3++IFKb2 j aןWf>wn 8̅gTzh&->ۑJ{P֟x~y=ͯ5weNOL}4izkaklc v/.l,*rŰ%7|wqp5aϤ+(w [LrE12lEVZan8LIP-֊U<98\KD]7+_ lVO2mIjHOVKT(4i !(Կ߈Z#9~y&L!=%ŠXQW|bxj^9|3#*|MXC%?,LՂݸhEVYhEVYnFr9.ù5sXba a -ji7zh$EnnLfiş$ )d ح-bԹ2c@mcȈ"Z/sa>& Ryp=sp> E @C sᗚQQ#,Z]h/̂ߪj\+`!NWT D!CNh2Vw!3wM*=śQQOm8yzIYy5 3a#L?yĀ^3rBtz|gW\=ty{;]?SNzUgd?dw$|W\۽sʵԏ{(xn},o\PrݘF9 %_KK eҌ!L2cpt W|  ̈́1Q/3Ȁ%({,2K.d:54Cm1]zgNF! |ȁԜL3uX cI&qҚtbN)0(a^/=w\b ,?1$9C8K%И@$M=y>TqM=oiJҌ;3%$:'JdP E25ĭBu'AqSa D'bM0ařTcPҖ=>\9.}6MX }'Sv{gǖ|[!ʜ]ͥ`JU%W<,ܑ$Xk]le`7?.%d7Ƽb=+nG~$Np]FaVQ. ̅64u]mBpj.>\)C$%Pa廟͝tu\v\U.{ 4\?ie si #`~z]Hxgv"m\tմ9r1P+?2zYde/n /7O_atJ3H7du/#Eõ1~3uV6v*-i蝵z}+a=H m$fr tRƔʁ`3$LęLo 4Gq++jV': #0`| 2dEcjŖmfGv{B1:WEsc9(fԯ]9/S;z;"Lh_ˆk{k ! Z[H~c(hZCZ v"z+j|f+"#v@R EVukTcc =(y ෱U 6i3RQGK9+a*A,LS+q[2| mz4AJj0]ɔڒ=\俍fľ3ٻmmUw I} xh%R7۵=Y仿CIe=;3$LIC0\CGc.!(IxXs&]@F |hWAe y+3xwb,]ì>!w_=Ȟ(tԇ(wB5+)!NBMgetfxa@W%9VAKav~߿n?4N_w3fhrg*+}KMV eJ&M&VJ:8W_oF?(+͆_;s^W%U.|UrUrfD"%M,9Nv\cK- i#@cΦ/jZk>`2;vKLIJ.@!hl2MxcF}8dgΓW$oA}~{kU^f)˺c+Y!C..c7.+(ָ  DŔksuJ2AEÆ"1i;L\p#ѭЕ J `6"%kSkʨbBRj*'"R'戥Fbʔ=JM*"@oS)r0*,!ņ NRĭ1ƀ4X}(: r7SA_f(5h[au]sinAxQL)t a 0q#d ֍YbbͺZ,,gNor-{GWp_(zɧg^xgj]E뎽 ަܝM/Οdd yxIA^HeA!y7:^>:e2ճ?|p(P/|LgZO=yাeTJcwwP s:}LOq39T"R4dXoîNb=kX%W1v %j!]Xzpjve.|̅O2tKk#i`RDuĈv YBpW6#fބěKùU մٜQ&-y Zh.vm:je#bIkm3ȎY,۵J(L ʷnU!ķ'XR.jc!,8uhsKjLAb7SfN2%s!Z'•`㠘 6p]9#S@U6P!Q 4"sCtHgBrL%>,KLd vÂ΀ !K}!A̐KA̰7`ݭP|07Bw^]r,=F\*10gtg j|qIJbuf1n1pQPFz1+Dh%@d ɴv"sAqU0$L!s#&( bg"2N}J e֯J0H*AUIc(4!Ef'W#-Y-~D%^dM`pv ;ڙ>VEu^8c-ysYs7Tlp c iCkAA}\m3yXXwB o!46[9R0!$;uI$)M JZZzEJmhquh3ZKxeW WK% 7U>{ζz 0`Y&󥽴e"r.HYIR!k3GlDb752U[Z$A3U y.nOeu@.-z 'Ovq)y)Tx8Q١Z\L30m]gz ӹu5 ZCZP\\B"Ƒbj&]d4'h8R4,  L\Sa*6& tjǩW :|6gPTu(m;l!1 .&\0z\&f"֚ԑBEg!`NCIp sH#/,U2ֱIĐ$ ĥi T:Qh5-aLnr vW1_,U`Pjla>}C&#\cpmI 7: !  ͎ |`Hntݗ+# åg)k4Ii*'`y{ʜ'ۊ2XR9MLuTX_$>2cMbbXHfɗ:l]1ui,2+iC0#^s-Y({±1PҘ+ 4 ʺZFqjh[͇][a8Hp;5? 
Pݕ lP\GV$=YuK7>ZMsX h3n=YeAteU`==􁡻~<H2UM?ޮb|9hS*CP*tz.+Zȗ5 x4?GYq:}-hkOh%LfMz-$uz%By2Dv<%e1l}(Q޶Z];2nO?=Zf:{)44Q1Jv i|)v˶[yӳ燕`OK5)s>= Rp*Xmvw) @>)@]ZA r%ao>} R ?N@$`f;c&pซF e6W=6z:Kf_j柞Sl9ݔmqK׳ ×|NߠssF§>t8K`N6ޒ /Rd46_|&AaL⹾J.g{?nb/Y죫ɇ:Jۮg̎.4Ŵw/0p ˮ._K1ٷnrfo*睷l[1̈́ϡF~p|gmv#36q(>? oNx{> L>$?DxrrbrtM,e>&e.s̴|=Eldź_iG]t6LcW|u1 /S7GK1fm£IL =-9b r`d)ldi?UpڙX Az]N>Y[zC>Oʜҕ. H4$BP.69I*MK#=jGMi1&y$VM>~tuY1cHGi aI1UR#Y.qT%V)gu̎TOš4hAw:j0$ Gَ-T-aP wq+ CTT#`s&`1?Z;j1m<6OS5kzTAhߚwfX1sզG@,wmBVC@{_R1]= m,] hU".}b7HcF#4+zw+ŀQ\Q\)%][YrpWVoMD]oÙp+ OkcsߚxiEak̋AJhnlMyA ]):L/b!esDrQ$/ o| ٦_z%{Tȩ_VnYӼa R}ãVb/ U*"/~xKN⇧V vz}1gC%ϾWf*^ yWjG!{cxDWRd莧1yK0٧w.|XHOCzGz _+O`EGYc*j(4årx43B%'uYJ O7 764!*L{|Q"m5ҩ"[n Tل* -Kxd X}uQ*AřdoKְ%?~zے-ےUknvc㕤 w'K($)Qy] }s\vJR5߭ FSEQޑ\Z }%a-'S*!.5s/|z7nUP D[VCܴ˜;Ո=T7r2HbGSX;R[ (LޚH83*BuهT|܇Tt?]ys``ӄ!͙N,46{15#L!$n}@+Ş_ʳE%s} Xݑ,Fݚy#<=jd̗J%5h*C3Ps` f탶-p]}IPgN$CӖw' `$1abqÛiw7Jֶ]>cnu`񎓆#źq*.uSE"уg*" 8e]://WCKEK2#JB lj6KPǼekOJ69:Qmzbo]1,?Z_';W皽yȫ~tѯiJV(ѯo3}o/ג {'.+ػq̄)7xW?<6?A_l-qkݼɴ BU:ga55~ej͙5D\lj|nV>b,{&irjK $ƙШ{^O+έ?dzѡV0vSɼO9v.Pi\Qv@?h1r4Ч[ZdTO![_ M-1oy^m kdLP{YKlÚ `+^òf9Qq}rDi$OSB%kKUB\[A 9RFjj iEJ4A\pz,Jq %>, L`m:' S1`T[$78!fn*= zLhúp)\V"#ef$τd>DR*F^& W :f[ZjLG߶Uem\/ڢ#1ю鯔grrǜj-h&h$[Q_DMxRopPE#77Ό~rTrى]_nFQO8-1OJ<@(0n0Tƺ´yaD꺬HzmN`$刷0cMNaƓtM#`$T.m4bԙ%1ܘe}K=iDWa8CR+M&mYд_cmo1 -&t{o$)ORm„SI b‘5 (VrJc޳)=~xwȒQT4Dr;@jv҅vyAI8f"d""):;._NXQ( ds1*k5qfcQBl/lgb:):墷b=X #-X'bƄGl}z}1~0Ws>|PaT.y=/VyAUi L6Tm UP} >AkGQGn }oϾ |fY۳sM2v4a%LT&u´0 X:kg oYq/raVRQ)a{i?+/qN%N"$8)C80tzw0׽[[c,[[V|}Ʃ'F,agbR$`.|*+z5gVo՛KN)!,^Ne%׼k<Y#^\JV٘B:Bd&@ >Yv=p /߾Y),yof43tS*T 7MM]'s%Xq2L(.S"I 3OP. uAqern6}a:@M| I,`/-"s{d&2 f kĪt%5)bw^OGWAQU8ʾ*e&D)2m$64)p-h,b)oK֩2GD/̦/j0kJl^1p>KR+cjݍ;z|sg@La/d{eG^)+ 6z05V!"CrYDd$YfR€Lp&řʀDI>`1DZըjٜ '@lDs,X gum0s-Ju'aM_s5YX{` mg\AI!1h$t] m~wP哇ٷnl68?hH! 
)tb`Gf0&Femˇir'Izgl,/a(XslǥXL 1hɢ\i>-p nf;81טkݞ553\x-k?݂Y4sYʜCiL-@ g6X/}hQڃ GQآ*``1iSj%RTIA]}- QV*L,M(¦hp0s;nR!.(R %vT2n2X!Ȁ%^ %ӚZNskH({(:pdh`, w!lDʘu`c3ԧ\y4E:D ?fuaX=Γ lSt;0NV  X) si=b[RBh1ue![0AK5hc I]_jW\М.cڙvkg>Y |VTт}GV=hIaHR(˓h8B.c~4M]_܆DKXj t ]KRgȂBK3I ´ũKHd`4"<*`'%&4ٛ;.Q#lF/eȏ-0ph:MlP9tsUPcʖlz3!Tj҃I0|66`!0L]qnTXkZ "5Zh=#Pߧ^yNl4!c[RnRRM4k͙/N (q40E[Ґ?HBZs9WP8*z2^.^؇eMstE%b[g` itRBn'g ,ʢ29[Hӛ3OP\ź< iDSlU 2~Ƥ0p𖥩#)`MtKm: 6{c&#XoL>WrݍEix# LT!Hc[ #g[ d|;oFz|ʡTQKsQHM^h,:Id3̠4LT 3%[ ]}D`} ^5JEi*(UiDqrQ[PiL-RYGϫ1 H3zâa9tYWs.ۘ~`q%vtt1ngJ1,UՠXN-̈/ OfJՎ Ͽ'gezrk Xox xUmெ@}7lߎ/FG`ЮyL[yQxWs[`.tQC(\YW:8p« - 1[⠺" ")eqTC!Gi. _8|hٽ8 Q9tBLNkSF$.Bq_E(';*?5擛Eȫ`MXb~A(̛_ 摶P(% U0µY>AJw2l0yL3yi9qqvhɐK,`o0 MDci\W;NW+ ye@krM]'jW+g#E)bkԅ./`$kTX+cxͨ~S⥭5yUEje'Gyĉr:/ꑛ/݅pJ𱮸5GaWE5XWvŭdi`9 sb(\hqp\r}=Q\$;yF)47|#4W rIwE}0 ܓVΠ]2O?e]*r<8σQw'~.,xX $b>+|}-"R)a7jjlDH#_"m2mI+*p{;P.$wu~r Z#iE'<U55C@wn#RYc<8)Sf F sI|0u3I"~tZ{Y+֑}ʫ{uVjR F+V=z_v)뉪U TI@[Y[n^ڒz{ǿ|FߐrF|{=iQCț2ݴ7Moyc i[F|QU  KW;W !}xIn9>0ov{L3A79 1eJJ6'H;+J2Y]'k^iG"a&ŠYA@Aj}n)qc9\媁F9H]M鮡M& L1clY.DL&W C CgewQp.<ʩ> RkR vUY1eqbrIU3 V}c`{,)<ƘZI;IG\0fL? 
,VuC ^"5B@XDPaA1}7VP~I d,H)Hv~ Q(Z+`a0L2+x?zB* IBJL-We?vF!u.y-&˵F+ 9+*Q40ZT*&Z*aZ d<<•TP[($uƩ]yij< z 0naF;wN2mi(|iMYbR[ B4Wk(Fr^4mˣԊ߶l.oFYǕc<̂*`\# BH2XO1 eZI PY:#)C%E9ݨk ^Fɽ*l?/w֮rn-<, $DhEy}YL {QȀ^"aҘQP+ ^)[[f"ډ3}6o"8Eզ\T mzS>"ʅ{܀)̐9;"a%юxQ+CݑX0QϘ+ju*̱\ u[qLj;u{\Qo:[8;ZXr1mΗ jiOov}⠸5AQ7oKւ/ezawfZӠpqO˝uc$dF҃S#ýwq+[K7uNۿBԄqa *s.HkA'4p%Y1răw[jQ$x!W^0WAoPE-Ug?[JX&Xmt~LKsYS81CҦСe5=lT]ғ؁ȺlRa3CG7GU AqP"8 z9!A#UsQ%i?;ݼZt1A+8)5KLŌO@E4%Z?!DO \֏F#DSGH ss*4 (R}S>Y3Ҿ ^t;3:RX(B1ORQiqgB?ȶaʏyFmΏ}=GȚQ*!8\םVS:f{ńua`GA%򉺹0F$)OaR< [ UUs77#UT]F9渣ڥQ`kwl'Lq'Yw mن4DcyP8=N^Aq:uYx>R |҉עۼ({R||:Ew|}*O:Wz,Kyw?ק EB'tkU{wH;|.>y Mh G>v7@ ;y)Qn|ž8}X sҊOMI#Eqۑ_:RmGtϑFZ]n5Cqf/Wv0ް0ٮ̠Y ]u( &""<z9gm^3cO=Y"V{Q9Rr'5Y$_~Jpa6\_ s ̹if9|P&~`|V6KHd r+3!ES -*r0eyҕgD.>2 o%X3Pa1Dp8Xĥq("G A*-e-,0' yQQR݊[֥窥(%ٝbndO fo~e5U0Nz]ѻ~d[bk;Ǭ))VR"1#dRtYo B,v"LyVF]=sD,rQf7q?!p7;޳1q»c죙eD=>l],F#i4==r Cˌze@SsSSP{s@*,>$G<(펡΁O~7N밽41N0b^$g&e贠ECQP/n~kzᲄ\|dPy_ n~(oN6׼L7=u%І:+&sm6Sx4[&8NIAٌڲEڜ\NSC塜`d:Wi*h_=e\hʱKG2le2!˂ *0J wLJ Ǖ@"ZO,  ):a %*gRq_Mrn+8Oqc JK@Uy52*ˠp!%t9L.7(SLp<>Lq@0F fbj/J(򲿿nڣ+zz5x ,sW-`*L5&JH9 GJ|_M/hojfE {T n!z+״m#~:x CIti%ѣJRt(~lqį"t(~ miėHCi',>0][}:^|I/F^ՇŵLfp\9rіd5ǕwǙ2_nULTPavj7P09.@mJ|j.+[{. -YSC}(oW͖QN-Hm%-^oZyq4sc4@|A-̸l$L'Ԗr0&0i$0iXkT;3[>p~vlȗ0~ a~^,|^3+Fޗ8'Eծ[3||]^E]Vs3m3[LYMX+-| V"z!U_ >D{poW??NF}h`'̏q>x=ɯ`cc]N3$:cZL[3n77 o_tQY_}? wG8}M_0;Տ({?xsWU ޴93aU+AM2SƻYQb(Fa^2#( 4P%p0\;0wqAVZ)/U؍decCb40a޳ ī7`-~v:߃햫12|?Ptκɖ)aZJk(Q})LǼ*S܃e25Jك/v7~67U곟t2+!|:M tV½yr>"tz5[!Yf򀋫.}f EVbe/^v,U4q}Gc@)Lu9{RM`=`:w 1? R.7)}JL9KV[-"[-ʀ2.ZS8*MaV;H g{8W!j~?x؍cq@5EjIJ>'wC#QCvM{Q],v/a2'7]H\yCy[ōuެx+}yyw`EU'C$!ܽQYk9话T=~ѧ-}ϗZ}{܌AU3naFze Do6~y=|^rǸGYcP`x)2lY`ZV1YL:}qUeU;%-i*(R:]o6>bu %bLtVC껐&iCBh֘;3$C̅"a$$yg`(*CcfYMNL`AQ! 
rqf-IrM< u22boB./h^nrmQʼ NGm]b+%u鐭K FT.m]59 LT\Үz suC}1 H, ,-"$隋c'x;3 T6hid"5 L0>B*b@FǕ`22Zz ,Vw?Z [c`IQRYKF‘tHǛ NcM(V򷫋+Bck4Ɓ(PL:l^:+Bpo`A KD%(pX:nbpQG+[ъH[ZUBC3 pFMGMk|&ODm9mJjfά}3%qb]qzYHkV msQQQE-rA\<(b>寽S󣗇e\7aZb^&(3>hu/dgB˦eh^أQA̚GSr `Svl 5&Öֆ/x%l\=4`0ˋh|ļ`+k[+T>7;~0\c'7focBKUqʕ9Jcȿ]ƣ~6L,Ì߇şKiVWKXH֒X0l!BPJaz= Jh8Y}p29ULzBaGd@-ʣd O_/Mbj ,ūKt/Ib fEAw/³E hxiA'Ys÷V MHtQS܊u{q}^6%^*LL\|:Lߟ&*P0 P]a$TˠiyK P{hgN޾~ٗsZՏoջ/_9۔8˺36w9y7'?,& $__u15TN^j寋QM4CXtg20Qjmp82Y *)O'jNXt ;qI I|9k78]y뤝 @Fs4|;\? {߻>]cOi;竅((?3Y9ubTE4Ք*tL7 ۑ+NP`d` b|B4PrZTy0GW Y/RܛqaenOM1̓:{hܦowdN4uNOwtRQ6PbbQ+M_g Zƍʦ-5Ca!#g%Le0YRēS-oUYpr&6Ė[`1U7ËeKyRAQOE.q䲲M. ”M +P没\60G Y)AdweHf=iK]b[m o.j)SJ@>V1,p yV[^\O{ #l2T҂I E#/,QYϥv  1Q#Q EzJp<:,ZiU@A$g_FhHI)3!Cpz "V^Sknһ7X]} 5 @VgobݻoORAHָuƾ]tV{ }߽,-ZdC`6n,a$;w 2TP)C 2TP)C:g%I#_YF_}eWF_}e#@_yEhlɏmn٥-C҇ rtkJ/;A:}ȵ[$m{܆qS>~y 3y%WsjEfx0  *T@P4[92b k9^fÎqخ׬Svx,0C@ 2Xbx%hvƷ[w JL;(4_ӭU/k\-%67pO-&XkUQ iU ל',{:<",G\I5>8@E[RV$nk98y+DE{AGg^2'RLg))TY;Ya}$ _f^iPDQ1j ;NDm2]~!sFt Yb H_XYaH"])|5]>u#s\ 2TP)C 2TP),(q2+2+v(9C-CG!qסZ}qv\s\ as\!u8\ =vJa,WS\ {XwT[ʃ c6t@˛>Sc@HI'X᠚9 4+BtO_ޔ89}yN4ӗgan[UȽ2&Û o2&ÛLtQ@IJ~5nh&eaRI&eaRI$؍XS \=ozps9(D@hBD*N/JpAVZ)HM^Xk''\֟L?M;/ˎN^GLļc?'%zT}b4ケe诰Fz">?Ǔ>vj/zQ R\; c R*^VD:H`9iK'>Jޫ ʱ{_Goø&=G4ٛp1'X- eo0g| !ygƑxdHgq0o󬻡ѤXM:? ~9`Ǡ1%a2? +քH@lI]} 5EXe+㊠lڿ8!(VC?4F M+ |9"LDUw7<xw*ma J+Um@tV2+R\fJeuZ<euYqݖ'nYq&*LXƹǹ ֽٚ ?p]MqC:G=Ѭu."k:df?s=~{MeGY < -)9!Ahwd-97\ (X_-PHϞB 9g ~&e~ ǜ8h6tEkPBzlPkgYm DcǮ pcwǮ#NzvصEc[cFYݚ'i-gJch-$WcM~1[t6p6#{b W|ҶLB2su>Y`fꖙDA"bNc!.j@9bGO |1rO|1ĢK  4*P)jJkP,p-'884ڴU vD6YfFhCH(f[ Idx`VX ` 7v"X}ALq$Cdsؒ%R{p8[bU9EJ b&dl22uS&"E"Q$k&c2i&VR=/s2O)b%6WnYrG۬ՕvMdzȄSG+Ҙ+$oH=f;sEfZ_XSQ[@J['e8Rk1Lȁbk53\K9b:7HRq)JZ\n x}x&`,;%#TP2dI原|wJ$gH:NRQ9 XҝS+ғm꒩R ]iƕ=ls)-Lx(Gx鉰w鐻R.VPb^Jkٰ'J5zAPoriDea;hidC[71An)['ȴ.[E* y8HK753h5[Jy2P:'4A4Ԍ1hp&-&ŖFp)`! 
857Ub`5 PTq dZpeatG1)u!ePK0th{>1P *#^`;1̉ zsDBbՂ&RVw)ڋISF"Qsk'T*p%,#0gDĚkBjcnAQE_򔱘k)0+ ׎K0(IuUȫ  dUеP`0- FxSeV)Cd[wE`f ^$ \0ќ: ȂAi@DzHcggbM:B2͊_VxOBY'܄cBl \8 c;}0 >ʃ}[:*< 8`r˱q OQbŘ2JE5nX(de,Ey(_4T yʣ &#$0 ]㞕u"OThOk fPf|Ơ/gNv2Ic:; UY$1+A M5ޗ_#yCi>#i4 ^%>tZk/f䥩&o jP465eP;L'&D̿( {ab=rq];/gk%v `c櫸^XpD!0|8r9u`i>o9#h"gYS=$@+`V"h`9 kR/(9n]G-ۀ8jK U2YЩ֏oM.\ż*pi۠&k) ] #I r"ӧ!t+חU^Qߙwu4DC.z*ce1F6F2WvIr1SЋYuD! =tKc=.a Bc,ZJE*QSeJ2fpNJ@AH +P(Rf)cyeV( @[#iI{fipH")YG2jc:LML\}  VC[ZtP6JBM2@gM7(!ck9Qbu\lMsE~L{]ճpg&jdg eV Pʵr'4^PQf 8n0q㕃?患Q%03 *hI>O_)_rvt~ NO~9PkC$q@@E&]Kf=WZT864Lɂ2cT0B$K8OՓ5AݿHSތ[Χ!MҌؓ@aRB^;(֋ =TF mߗnI*WRLwWPz!)C2ȶČ*-= <`L{Ps7obX"@ a+EĵQvbfA᚟r%&oЂ,XvȢaXzL zus s oljc-hZpө4 n;7vdL %F &_6LRb 2KqF2RC prRw:?o*mvaLMcLkl^k ;@vp h>>?\%sR0Mi3ll7o{ ~>8˫,z­g,k;KAgmw]VZ=jkI?7=́{EQޟ oJQ킛] 8痓<)Q4b\IM"}XӓD5/pR~lص#!. ~R~ܝ$֚OoaYN[Nj# ߞq'~tUnq}]hMDViu:9nҨQ_HeM/ݴ'i) qM0n! F3"""!n4l.ixDkݲzg,B ɗb ٩}u]cCocA?iYz8*c> J·L9øB~0߬9|46PA#Hƨ;lP|>(%' *U`2i[lBCmQ˲%8ȳ3]@D"XYTiՖ [k]'D Y҇.w L /]@]?sK`Zni ~\mc]|ty̺$]ztv#fOׁ '`yl4s:\g=wy ={z*p|Lԏu;1V~\[r~t|tUe hR+͌QP5;(Errv[ X)5%(ceW&HdX >~k/X'3v| n]?woۇwށw<孽V_r.v_ƃnj)lQj <єR)wRv܃>R >h}. oUי[dYxp`,M]57l &.D&F1,-ݶp7:eQ#P4wD$@>G<$,GwD$e+4;4S}4fMnK:mN|4d\CKSa? 
[ T@r>w \l*#=h* bti*KS..M}i*H";.hz3ۖ{!ܧ4_eI{X7s(iE'NhI/Yt$DB-:i_tvm2\t N/e[GC -ϱj<2r06a&w﫾Õ|G壄}_7 ٝo۷^r+ogϡ|`i;X\8_wIȾ+WZߘ-<:xEg|񥎿s/Zkkw*6nvrk#<dE}I6⳹se3uOl3`lf/-dojZggr.C&a|MEj-v+>yܥ_lMJi`Dž]J^6c2Ͽnyo=+øja/υ^aeDoN^Pz꺒~..G?]UZNbX#<+/LI?'xߤ̗f][}EO6TJw$͛bD!HxO}h?_1RJ/6Ihыdl%hAzSs)}N0ro]wp3׵D.R@jT[չLgjwwi ex;]m|r#ýǛ GWqxA|eb =~ӳ1"pQ}!_kx)s:?Su~xؼ./8??*lWb>Ugy4a)%5,2(A`L/wgpb_19W؅x3M)'A/>E"koMǬ|Kztj}J,M<` Ap~(p NRMq9^k%JKK)S[w5rfda#^y.ސ֍ W;bKS7޾1^ wf+OD-ӫی[4wa|̳mk&.!1Qo*.ޕq$Be<@ذ<3AռDՒ7Xl"Jn_ƕqd3mh-> }Wv1K*^C" i{KG x],l8p4C&˥ݷ*~iK[L_{3nMlg?#ci%e Q95YNZ͇Sm% k8g/Kةzβ=S81$xjFG4P$-?}{On-~']:|L xN{1J3OB{ \9z^*hCt揿F !gWJh"H,o2P ѼB'豚TW&XČ@lҘfIhjPibbE`"C!lS(piy_+\1g _j~c|s!hcS~P/FICt0&lM֤z=x,#61JTNT^'D-KMf[^K<Ƭs/I%{mE-po^׽y$ͭ4zB Ѥ Z>z޿msGe|K %kwK=$x`2=N3|cs*V=fvTNMFVMFVMFVMFu5 9"T#+Zd@7VrR& 0DKN.eRsEӺqZWtXW-YS Ñu|f$Uu&q^-wZs$NRHMNۥk@MglUܯ㏜I [SY gg嵼gr~IeH)~l,Y6D.YF[fQ: hMQ~/ʭћ8#rtp1)^b6Z$Qm{+]7y""vAmzϢ+G>,(e; 9,/bhoVtmcr[fPkYrCHw?jkh]|bEnW>PNޑms̀({l嫴iw.8Ѧ6tϳydpS>Z,6x&ǩ&pa!odb8Bn(Vxkj-3hB`Ky5}.^NEl?k%L%™ buE$r|c:SVj;ΔlڄP831Ω^OHDܕ>Is{޴-31(3 2ĂP淍 =謀 w`o R*#@߆njf"BWO5UFg奧+.3#ݤul,n_V-9/ #uք :[7Սō%f~٬b+~u9):+3Wvk2o*yI@)-%*W/x޽ݽ>(>i.mW&#¢obOۛbZo_@AeӮ*ݣ?1Ys h7sG_ D9G5U?8l+&i#?ʲuq:MuaBd^K![}ە=Lcq8s.My'p'w0FIqGI۠M[qbe+##RKJC,T)y Kۄ m- X!E!lKV u^ؓJżgj6RoTۢC;%mk C=kRQBdBOQ<%Aۢ@Z0 ږI Sh[ T Vr y۵mBjpy Q⊖^`{d 9obpX ET BόQ6nЃ1jgΕ}tL3lsl4ʡ%B܀r^MjC}y+=?jQߚk zg,iwz?BT2m=k z!)@)=4*QIw[&It?(z$\{H3.1e Lx[ L4{H5"R='3j=G j!k>ESBܸF rTf _qҨ;"UF=F%F|z[SD f E7Q2JhXڙa6#g2ԢX2"*;Ra؀0q8慰zZqV(|P1h%~5cTuv0jWLqtKˆ8,h]~ܵ9enۋ3oM>զN /oۓ/=@s=[me:MsFۄ+ سR~{AB !Ȳz\džϴU-x@&J _'JW_eVpZ R1C:4Oߗgsu\KXw(Rr]ړ/zrⶩF\( Y]~/=t[ptk3i]3 z1-+򽔽زc |{|9i (v Wϲ h/6ܶ53Kp~Uz^kϊ"ZH ?)2 9|bL||6 SW Bɭ`8 PM>[OV9 ʋoyk,@El IDv/HH8n(Y*#*pW{Z~rL0pf)C~NapIǵx7<تIb^ A(N>[8hMkÿή8sڮTg8sU$UX>A&4a2YiMن:&Vk`00}6XmyL7f/=,\IhwG2Vo2C(;cHL@J&d0K=@IyI_t:~mπ0nm7v6z#fJ{Lޜuk4B -0ZoՒ~4[j9P (z ~;]$񴡗Z{)$.zUy?jO-/Cnўn޿`~6X_{p"x5>=B:mR^e I pHos8F Q9Jt{<zm&Jt=? 
vҍx^oxVY֝m̄iĺ?:ůNa_Z~sRj1:֓LBiFHr3慎,S-S嚢q'i,c)- p2JVq9`1 g\n&,1Y8qatyN4-XL`n NdEA5QNT+$:Q r ꘈ$&Sq\TF@SI,tszP.PATQFio/XKݒ!=Cxm9y5uj:k5Оʴ0=q0;ۓ XVllر@ oNqW(WvCvXӝb[lhg?Ի򇞎 0ؼZ8Mim_j 8PӪxKlSM5sW߲;-S,!K܃TJ8NF ny!bO֗";"T&'u>m((#!UE=P@7l qx`!Php(EHw𓯩~Ӓ93UؼZ v V1~Aʰ 2(rE՞~-嶱= WpHaf6d_;iCԡsP 5`B1=_2ONm:,BKtW<xf2M' \* 7? IAeh "bѰ®'DWݩ{QMɃ!f)]Ր0a.s Y (})bKK<ҵ/'8B&s!_edޮ'\Sʩ0.ߚkz"doh ‡f̴H94526-ߕ/g/ 㲏GUFqZNOiG?#cR(-0¬j(a".Z[)G…| 꿎 gMi:S3ALу%;$LEif9iO)jZ Zvx].̓SBig|b]x1Ȁ}~V7>Pqzs!B1DD+uםvACj+;Һ*RK. JԿAך^fyoDʬ%Dhy׷ k^.ʂOp]69,0u._Ql_x>%W>\;3Z()9l8wqhiP%#O)xn9 kAvÂВ w >m֟D"C2Ss1 N,~}[g$]/#)êo{6=[Po@PErm5Gm6s'D$K>2Lw]dz} 6bxVdʸ3׍.$rn @kOVルXnW^g-bio<]hACD/O*>x\5 4}>}M0@m+AKl 'zUGa<PK"^O{.0~G$>P{-(AjͮdB=sڀ1B8m  *d%lLj‰T ^:ݢtt2H0ȟ:dwq6o^I:SD逞1p ?riF 3Lc$!lU~"ʬ;W:.Q+..UY  M"?+ыD~|?ϳ.ɶV)|5/FUsTs<4/9ţA:P|sAѹ 3OojeNO9ǺyZR*S2@ք#e$B7 Xnb>sIs/ܭ>}*:QW7lGiļôq"ӅjbwqsR>/A 438r*9H@ I_]ѬRDeCu@xi~Vk`uT/8(Wr5Tvnpt g4Y_,uYr[(5 q}2]ʧ>Ꮛ~h=._s<%s`DfnbFf[qO>NV|iEV).+ԑ<ʕզO` `3m @ؿ `!KqJ139H7Hpo |FBF6c11U-el2nz>4n_QB-"SGcT؉QQ\sh7Z,:!<:e6u/y&Sd'fD} eGT5cGOO1%^:V/٧DruD9r=Rd*= pS)gƤRg>wYuZ1Ѓ:Er U44ju<oM]S:mQq fLdeyĶ1 WDl[%ۡ| ft\Ol9t>eX E d#VY:hA7V3/wSP#X̋%!QBmQo]hVv 3 \4&z麠ފrB!)%{Iq{'F E1xoC$\ÒX0W} D1yW9?>:o<ѻ!Msč /`lrÁ_ SGrgFՒO͋JC .se~\;/Z 2-rs dB;]efGB͆sf;B`bwo7OZ=vr#X[}ە,Jd >ߊ>LYM^a ?FIHюu]*B^H;?؟m)7zgA:Ɨ$.ΠĴճRV݉t:;l)3X /׷> B*(z/Gw 7c@u\dhF~Opl`%tduG\~ xCVlI և׷+-V< XyM8Ix }&r^FxZ=Jן]Zr: ( +e+l=dRmL$aDQJ*sF; GpwF ̝ғX*\h+ܹ^gmE2S/Z󕌩n Uap(cd2PlŒISdY2.C34YoɌb$4sPDha"PI@ũLesLqG0CqynYL} L0R60̱,fm"m5ҩ";&pĘN:k;w)N0<\FOc^bO3 `B.ܢ&}0gɊ L,,q0D|9 uٔ \Smf)͘i6i$aHQ!EȥMʐVèڃ %n`R,IԢÿ2AbC$?T"gGʣDP`+(1`&0*ɄS92E8 2%4W ʩQ 'Ff"C?BC2 ݛGleVK)  LIpe' [X=4T3$X[OTQ Oc\.3!fK0#`8)fJ=oq7b)Hf! $IⳂdEv/%9=$Sӆ q!MZ0Tgʌ硹rzX.aApD+iR큄NAY DxпCd-@4N<p}bR$78(Rnn Bz<++acQdtfn©˰mDi{lEDc[3 K&J&LNʴ7f` 4u pfJF,'B&(!B14YA\yF2inW<BxQ!-x k“Fi;y}:Wo M{lM('ϭ߿r1_|6ZXlDo`nWeW χ X/8l 9s!b9B^(l" :~?"BF0Ub61džSbn0!W_,(tޠ0LT٢PUt#PD¹b'qnEk&~| 4W?+Mӛ, <_<$V`T2Ic+0+N(1006J Gt _¨,H0Y!%  մGq[ZJt#V*P>85|>lEXU3wUD";cU3A'GTtM+",u^*$b#P|-9lMs]} a+?j5"V1tɮ`]IPY\9ЮaG? 
!^7= ]ZMOh,L"c+q+2>P! ~{{!~wo]Z{cՅzW!Ty/Ų+01UOJ@K(p]E۟jUBjjc3eݱrblXv|b,Pj.E\Xh91]x4e\iQu5j;ȺOeU}V9 ŧԑG{iMJ%w|tR<>oItÕv%*/&ifcܥ^jf9)^m ) f%Z#S' kD;1ԲjE C-=NzZoƝZѬl5Z5\L*ES`.TxQ";yR",,s1M 'uVJ̛#_3؈ bz+W G3.ȎpYdHB:`yb2R`IE8djO)>jZ*K~ ?+Κ_!I=/ ږz*'ЖPC~JquXaIcB0p/B3S$VrUrUɨ60QDD7J.%2>OvAtx]wuGQvA*._"s8_"k$VGX{X>0DBLL=$U?ݨv7bTF i %RkEUoV|RZt\ sYjtR<>΢K`$umUjtZ4!QB .CFz[%gYg a|F볡 ڙ )C󌨶)YP߭ ^c2֙k;DjO;?e&Ae_}:*Z;{}0s{-l?KXd qdry97v1b JB@uj\!7*D¡HVmk5D*[k&Ixu@Bnr'mC81B!Ů8 EMnG^7(BfqH}Xm*-GCkWi )cw8cq#9mvĢo=s>ՂH0"6W "Hj;D3,Ka {F,BEfMyq1u{׶ϵhma9v-6E {? / h;gE"bbI9@L:R dɾM ۣ1 5Ԝ; LnS6gXjU6g쉖-aT:8NhuAbP$/#92'cQPR(O&WDq͒4܈;mC6!Ҕd C>]ڐDK!d(BV9KNp+f0Vq9jȸF"T ^ B*(O4# &DmSXǘ_DhT<}IDJP$U<}aSH4a2ʼnqa{ϔXtʦI9-NOdZBI OlÀ*V3qz!9%"&/b3P!ٻ8ndWz9gglދ4vEu.ODc'HnI3vWD鮪,VYUJ)jNSÔ13BhVvyK75߿ݕ&] `JZ-3isͤբ5(ZC'XáHӮ(F(S*%{;<9*4P"]&(Z+leG.}KX2xڈ=rB ]ŵj{F]=_+h 9q:ebJΡ.w\KbJ^^1GK br9"Nߞۋ2AèU knjqn\qɻyxCڷ,IKE3V2R \XEa٢P-㽒e2=m=ٌd_+tM9dQf~n2GTmL㬭d3ǿ(3Qsx<\;3bgCa&QA Co&`gx(,(AMyG:p gԻC P8Jַ,t6 {o7^E{ϙL7d6)OM7Ң2p2#"ňnFHwq DW#k-όbM։ Gb7RNP~":Er};Y5w>tT5Ǎz^\oCvPnFM6Cc,.ޭ8t,Yӏv=ݾiD O`>w~PRc3Xmn0Z͚?_of#txR<_<+>_D*L=y `> ]DAr$LGx)J8%  2P%'N!!TENot䵂pWŗ.7 $xGH XN \,c:AVfXSEV xsxoڛs(rh#tnT筡A@ w4+ GMb*YF%2i(H[rENyW1UV\B~v[]N941l @n1iaT-Ę EhxehbyܘHyȐ23 g# N1ќWw~.uh29aVH9bQX4PI,A"g`mL8B0B8WU§a#FVC..e/J8^NU'?6?_zKpDw袲fT7ENW/}NjD oИ榋P"CkZ5k[FiVΡ ^m/[y@u; h$u`"18μȀYȚ6:, GR04U=Z9]X E㹉THYAA #{0D aw^WhA55њB%E,INP?Bۻ[t*TRD5G6*GDMh4&y 聡_L uR**ln[QC"QpvQ4 ļdJ(̑WU!+ᄀn |EG5A6q&N5g»hu!n%ާl)%#`X@\QMQ(Lexz7=\ $){J^/\n~[VZ,#ڰP-Lm.NZ0\|{n:iz|Z~|Ehc"ƫ.NqU.U{_W6h 4@z~6{PFY/?یdUи=lV^o o`9`*%IxG;k1H>}mP g |+JY>2*x2*^c~H»IJOE?MQeZdS9 ٓ}l* `MQ׉8Ҳ`F;%ԨAAG1à P1v <`J}T Cf2gc7҃뤃 Z&Ƣeu"V+ZQ<1H%6ϢwjX. 
ղq!%$SlSVmy.7%߼Yo~ɾi?!k3/iFNl`K,OI2*M)8LT5A0$Z[fN v_&ݐ⃋;ff6Ki˧g {B庛Rl Wk1x&"Fk!ܙS\/7}6z c+ pyE6'Ry.n?7By0Ni`{q\6 ޸V>H CKg@s}AĜywܢ EݬV36a\*Zvx5"bm넔%&%Ą+[l{+Mi3Vz%1pyʔݽїp-gW84Ґ Oapx:WH} l˼[QSkr_V ,F@`=;N;Y cxu0l1a#-@r:D{!t̐"1z딙T?h`S-Rc /vW|tQ`ct PR7PN #h~ z tmd(t--JP#Z4:TKh 'иH0zLIzRO:]h閙7ſ\a,Y8m]®=?y.W?)t 3}3_T^ Q L"f:"c q@Yg }0^O|'P zf^RaRiNMВ17]Cӌg }0/y)sјF9RlJ'P{'wJj|`[ۗܺ'}}̗<%u?f4%(Uٷlc,ߕMvğ\ݵ5'˳SlX/YԼ ) lv&/=;;".,XU*rsvq5,;X`͚ v¹CM3#amB(p*&'I,#%FQG!Ej z  #x?ːnJҋڅ]r[XyX _yq!6U^`6 {*pKbuH+|^-ZU 1r.Ѹ&8Zۈm*=YXAC%sCPv{6p q(׸'ʻ ʹj:7gcK kM" !-HS7!6BDH @NC,!i#s#"OAy (8+h#@{m58(zφCmmxޯ2 v$"HL6-7i[^TY> 鴃p$da[xhJmX,4yk_|~3Hj<+rݏ#WGP )⁲Fԑ^g݋Ar{EW1DmH0n>O(%͑G}Ic pffznG@%UqތcDWя}НTxhVzǹx/ԕ_sB~w mQsʕHJ!౦(,Mv> 'k NQ3_{R5f¦/:Bu͒OxIj- <`X9΂tP-'^FUH2]F!9ޣՁ2 Arp*굠npSͳl51Ω~ h*58¹L"I4:T5n)r>mW7 ,nCQy _˘>Dv AAnqSe|d,r ptTQ?][c`'nrlD5@PXlZ7ol./Bu[$Ltxu7~tIՈP͓(D&NGI㽏3-qQW}r&"V q!*ѱ,o1D,t"XJu ^?52X+h;_c#UJ{Ɵvxsy}8&2/@"'CđD&NJ)-ywqKV{{2$}%ڞM쫾vH;JHf8TqD}pYN"5cQ&j㺾E$ڢ_^6YLsY7igݾκ?Qs:YSCOE^p_QȤ=[Jixh! 8cs_o( S l0j::wU$A5d=V415Gt aSJp ]X7|fٛeojYӛjV PHˣd($)2hLL+kZ(r`*b|p UU.~0+3-rnfF(F_,KY,egM)fMH@C҆{RHRR9<3D Mڅv50lr0ifkiJZ?{剦dQډ$4$PXGV3[@p] Hف^(ݾ:jl^CEpz,֠Q@ci,F1paLgk-6 Yn)?(&J:04J¨釓vXlTGU/"eeLNpKM?N*kIaU@`І^Zi]Uљ =:㖴1Y3bYJ}sY. 
ٺj,qv0Hr)w\P!XI&˄M)AO Ɏ5ֲ7,Ǖ,SޡĒ_@L7^&QHHEH%ȴ?˳vMVPD',%zD˱^Rg'[PDyK`i?0~52Yre _;eƈrV$h)@g2琀DigDW^D7ˋ jK\`n6T)>5f69f<ʘRv,KYm2sHT w =IpJDgIJF X`:i^TN#b23sBU\wBfS>S%N' L)c\B%J4312(PO**=/qh\pU$X'@$׬ZdEA)jHr RF<Qk+m.)8J9ȥ) DEPI-]-p4 V٨l qȧ2 m,=ڞH歍J+MZhqnX +$FBf_9F)(ŕhmS/&ޑnb9yjr}..gyVBnn[]% /$"Ө -O{dnSPB`;JQlaCႾnΣh&"h]yt觉TKg 9՘" >#3BX#n<@72E!!Ҩa@yw '<ц1 N#~o9F$dK2p2(A* !]h L^ *]6WXt]+ljkerr8F~ J#Q5"ŇYc{$_z[U"# .׿u9E뢖H=e9Z")籅T #9甌V (*L2&<(giBa*N,SE(L*N$j'b)B\T"{rUfi ?ABߧ0b:"XXJ.4v%ӏ2NQ`op-1WJ9 WP;ɶ3=zH1 \O\TVj4T J#І]`IwQ;)X(^ZeNPs*0Vy5Yr#O3w7.p0۽ӟ=6ₑ&'X6'@D'=a5@BqNA>{nMӏ`uκi.3]Q9ysy[Y6yԘ"ΪWCa,& Uxbbp$t޻mfύj*x'*'ۜ0D>?e252V(H |d 1փOhӓ{Z@.faz=L~t>E^H p1|D9i S}^k&p Blm^(NLGTK|Jw]~G>B8Fm׍OX!0=b3Ac{)x!ej@l;*Π T 6M]~y5pudsl~q4{]&׃~|[ ^]y9v}r,^NU||L~ۉ _0qVWydLYx$9#pbci#DŽT1RbQ!oU[&Zk*##&Nbe,0 [e1 G T3KQ>:mx݌.Id-$FeF pgbd$cR`F),ZGU@0bM\NƊ@Ǝ+%VıpTX-a00謄ɤ@ȲFƕdh,SVE ΈFZ6:=*)m4+r%?e2N4'4(6[fOZw )6iMm%[rw*;9HRMW0 aGO&z_s1 ~:ͤD!M2 -488o!n}O%DCZc9 l#0y (3 eE5silZ|K+0ϒ-cX0FtNCHZ *s 瓳3*5Inxx9$̎h48з)'eF?R}{4VՓȲFf1"zqRseM05;'aZ#koh awI}MgObl ʗUtՄ2G3*Ts=*;TgtUZ2WO٭㦯NWv ^CCs=^RAϏzPȊ;2gŚ&,%reN86PB?U1 D3!,aSvi 80|[vPC',!?ǭ~潝-)~ߎz-8*N_FA|<MMG;-dMO0ǧ@;.c0oޜM M \_NBe! (6lvi h$Z=O NZu ,% F.1m}o"<پ: B06oeɋ6#"8 %Q_*(t{i %4ƧK<&v!q1~BFNcM-{vP+X~yvfI q Ƚg)']6a8;8eT:Vx(/V`A=$ܭ@sGlA;>sCeΟ+ϭPFóaCW㓢{柌ˠ(MU)df'mKP7xRK˗jڗ9 :sHX-e} Ru 5b-~ p9̚#b\gl!s<)ֹBfuMQ`qr/[G6\InD->z;"+1s r9/P׺BHٟC!J uBHot^21e-\gS ji2aTD:{VP纀A#`i8ҹme Aϧm%PrtAKxNê.8M@XKM iD4 %?X#M@>YKU/׍%p{nq=iֺxH<*(?),kD v|h`6ZDvQ0t;!7RLRS={mh^F/`:O)+N8R,| WBemͯ!\PЎ?-4-+٫-q0{Z6 Ҽ ۂ>amK4e^Ϯ}n T>jDeQ+8قbtEޠ,? 
P<TMi/Zb Qr'?Xn4h 5"u%1#(q`JH艨8 *ǖ@qA+(FʡQxT)822QᚳB\3<YVVT $W@ay Bmo"p``i6j^ʰu?vB^|2)]OQof|va]Nց/,ZϧJR)={n˶/y҇a=""D$\Jp/zAҟLgv>1ۃoqIgD3͇Ch ܡmZ z:x #J)`$c ~fMwN0,m}"\a- ǞO Wp&Pa~# ggFi4 C.U`V|82zQ[->hr| o1G` Fx>f`?fkL 診~t`wM4ffl/Y0d>9x1XQ#y&l|-^A6eLmY;ct^fmznC8 a!Mhyown{o^M@bL h6`E1J* PݏBncceX``;V-zt[֟= ~s$S+DD826"&!K #JY@1zGv{4o5~8MJd -FE8# 2\[$`1ikUFKI]̿Myh C|R|羐a i{EM{7[,3 5ޗyuMJEY67Ӱuv(8 mVkqZC H0XZ [n|2j*+V Q*6U&ۋb^HQ8BJ0Q=InJۍEжF@D~Z5QDVQcB`II?q*`=,##7HUF qFPD!aEGLr3jZ9,(( |hF\PJ(c22VHjQMR7nta=u*T9x< TcDס C:ltl> )X˻٫x]Ys#Ǒ+yٰ ՕuL3c J~wN$pF#y4& $/+Wk?)d߀.̴\y7k]Fy omm%y/AwT*g>IQt/هmom87\z۪>ޙrS`iI{J{*US"kTV%,iS:iZHO}CS01?};;-%@ߑռpSr#֘6-O6*q/CR6Se|Qlky{4iv]\]bzZ@/un8=4\Ĝ^$qfx#p$r-|FYR@E'csNqxzt?<)#3F!4PUOURN9+k_h[_ZԿFrTTɶ꥞J0C=ϔe)fKnLTOPvۭAX)H@@EBtq!OQ:TRA-`Qp;%5k!Yy"jߞ}<d>:ٕ 5gq\U2A=։vE~2"UaN;f9F11IdD1$KuI!Z5/Tp<( m5ߋ{;b~3uےga~?I>i!3+8 ޠfj<gCtIK.%I[D E(0xI P s?\o;{WkfxF"%" .TXg{nw .Br{oL3H ) j s*mzWN^Ak`1q& wl)-#Z.=%ɎI#"F!o foWceєM6\kMz\C|H}K+շ"\Y 'NrG|{>oێw2䂍gBVe¼EUIԁ)mc' N%m,FI) ƌ g:{kJ 򳡂;/7븬^:Z`ym4Bݗ7jNo.fW/= ? ~a#loob<KNfWH׎#\> eIl,cVWZؤrO[31hۛ7G'r2Z{tѹ#cDރ1,ZJ6~:d&Oi!<ϮfR@'5yk6U,Ys5" `K/};T!?8Ͽ:*BO^T4Bϴp9*,Zr%Q08pHzqMQad ۩ S*D_7m{M=ܣ1Z'njC=gj x|\jvqudԋ:h(G{ èGC zЬ1`rӼ|C9aڛ&Xm}Nߔzeъ _}[,5J?u/Ie/ 7#tQ,٥sG<>5_8?Zpѻ%6~[hC&#y% ZH&cr%'~C|8c.>L@q).!F*yϮ}!ML LaP(0(QjQ4(ݣPеyWjc|%!d{cBSP.AD1u("C JT4c/[v[m0A4>GqZ%PAW|>fL͔y4apj>L>{z.jxΤЖ")A:R޳HHԜ~wэ.i"q=<ʂ% aZ_@CB.`q杭ku:M='YI&i:.PFhFnH#G@m`sEڝ~ BMNc)BP'L7L*qyK?}a?&iW&W,~D+u|w;XL KDe ߶D]LЭQi1cU INu[Yryk2)]Py$d|XZDŽ*nMƇ4>d!;uq- qs nFHk/R1iJD.aƣIG5[_'/ +l}IU8e˸Y<|ڵZqmBF[906 ⭺@й%0L;I8 J$FgHFA|7lb<N}T89H#@3|4APqeRNòn"&mdYi#C}ZsGf9u$c2 8\RTJsPm9lTs,F KD D @y"lܒCK(Wч1HjE8%qz~H ņm?"?%k͙RGp ;6 +bhB'J<L QM`[B˽= SBuJaaj\Ɂ 0l,Em-|[ߔIY(7J-s2"1sTeT@zSCf]݊L4ƆgSʸDӡT{f ^~R8NJMnK6?ͺCQ]P Fj|`0Ad^6EʧP@C=z"(Mgݻv3&{B;+A:hжє7+(#dj$ʲ|B*5."laP]O`~ Fًwg|R;2:%RQfx،? 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.551523076 +0000 UTC m=+0.665137317,LastTimestamp:2026-02-24 10:16:15.551523076 +0000 UTC m=+0.665137317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.563043 4698 factory.go:55] Registering systemd factory
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.563099 4698 factory.go:221] Registration of the systemd container factory successfully
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.564810 4698 factory.go:153] Registering CRI-O factory
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.564994 4698 factory.go:221] Registration of the crio container factory successfully
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.565241 4698 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.565527 4698 factory.go:103] Registering Raw factory
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.565691 4698 manager.go:1196] Started watching for new ooms in manager
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.573659 4698 manager.go:319] Starting recovery of all containers
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.574794 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.574889 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.574909 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.574926 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.574944 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.575021 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.575038 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.575056 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.575075 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.575091 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.575107 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.575120 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.575134 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.575151 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.575163 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577661 4698 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577704 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577727 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577747 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577761 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577774 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577786 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577799 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577833 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577845 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577876 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577890 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577906 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577919 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577932 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577947 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577961 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577973 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.577995 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578007 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578021 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578034 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578046 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578078 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578090 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578103 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578115 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578128 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578140 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578154 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578166 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578181 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578194 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578207 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578221 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578237 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578254 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578295 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578326 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578344 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578362 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578381 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578412 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578429 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578448 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578465 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578484 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578503 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578519 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578533 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578550 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578568 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578586 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578602 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578619 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578635 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578651 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578668 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578684 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578701 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578721 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578738 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578754 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578774 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578792 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578809 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578827 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578845 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578862 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578879 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578898 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578915 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578935 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578952 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578971 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.578988 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579004 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579020 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579036 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579056 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579073 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579091 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579111 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579128 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292"
volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579144 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579160 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579177 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579194 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579213 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579227 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579244 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579289 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579311 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579326 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579338 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579354 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579368 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579407 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579420 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579459 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579472 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579485 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579496 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579509 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579521 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579536 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579552 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579569 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" 
seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579585 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579601 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579618 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579644 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579663 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579681 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 
10:16:15.579697 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579718 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579734 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579750 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579765 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579782 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579798 4698 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579812 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579829 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579845 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579861 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579876 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579894 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579925 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579945 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579961 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579980 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.579997 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580013 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" 
seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580030 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580047 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580064 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580081 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580100 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580122 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 
10:16:15.580139 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580157 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580176 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580194 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580214 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580234 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580252 4698 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580297 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580315 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580333 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580351 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580368 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580386 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580404 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580423 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580442 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580459 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580478 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580495 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" 
seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580512 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580531 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580547 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580564 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580581 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580596 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580613 4698 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580630 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580648 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580665 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580683 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580754 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580780 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580799 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580825 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580841 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580855 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580873 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580889 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580907 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580925 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580942 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580961 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580979 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.580996 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581013 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581028 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581044 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581061 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581078 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581094 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581111 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581128 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581148 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581163 4698 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581179 4698 reconstruct.go:97] "Volume reconstruction finished"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.581189 4698 reconciler.go:26] "Reconciler: start to sync state"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.592623 4698 manager.go:324] Recovery completed
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.605570 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.610764 4698 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.610955 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.610981 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.611022 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.612803 4698 cpu_manager.go:225] "Starting CPU manager" policy="none"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.612835 4698 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.612865 4698 state_mem.go:36] "Initialized new in-memory state store"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.613389 4698 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.613441 4698 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.613472 4698 kubelet.go:2335] "Starting kubelet main sync loop"
Feb 24 10:16:15 crc kubenswrapper[4698]: E0224 10:16:15.613541 4698 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 24 10:16:15 crc kubenswrapper[4698]: W0224 10:16:15.615068 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused
Feb 24 10:16:15 crc kubenswrapper[4698]: E0224 10:16:15.615195 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.646584 4698 policy_none.go:49] "None policy: Start"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.647681 4698 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.647756 4698 state_mem.go:35] "Initializing new in-memory state store"
Feb 24 10:16:15 crc kubenswrapper[4698]: E0224 10:16:15.656200 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.705683 4698 manager.go:334] "Starting Device Plugin manager"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.705811 4698 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.705825 4698 server.go:79] "Starting device plugin registration server"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.706251 4698 eviction_manager.go:189] "Eviction manager: starting control loop"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.706318 4698 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.706420 4698 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.706549 4698 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.706571 4698 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 24 10:16:15 crc kubenswrapper[4698]: E0224 10:16:15.713521 4698 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.714605 4698 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.714695 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.715562 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.715589 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.715599 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.715741 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.715904 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.715957 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.716448 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.716473 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.716526 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.716637 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.716762 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.716793 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.716820 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.716859 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.716875 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.717315 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.717333 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.717340 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.717428 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.717560 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.717609 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.717734 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.717749 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.717757 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.718542 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.718568 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.718580 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.718584 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.718615 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.718633 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.718778 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.718885 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.718911 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.719590 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.719606 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.719620 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.719637 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.719660 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.719671 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.719859 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.719889 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.721011 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.721031 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.721041 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: E0224 10:16:15.759359 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="400ms"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782210 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782248 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782293 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782314 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782333 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782352 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782371 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782391 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782429 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782453 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782474 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782497 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782519 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782539 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.782558 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.807385 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.808914 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.809585 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.809599 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.809621 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: E0224 10:16:15.810166 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884413 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884500 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884552 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884571 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884604 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884641 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884615 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884678 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884679 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884641 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884828 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884858 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884919 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884937 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884962 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.884942 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885024 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885047 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885105 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885146 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885161 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885182 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885185 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885218 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885248 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885307 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885330 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885367 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885396 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Feb 24 10:16:15 crc kubenswrapper[4698]: I0224 10:16:15.885472 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.011003 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.012329 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.012390 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.012414 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.012451 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:16:16 crc kubenswrapper[4698]: E0224 10:16:16.012982 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc"
Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 
10:16:16.048779 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.075546 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.084589 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:16:16 crc kubenswrapper[4698]: W0224 10:16:16.097287 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-e67aeba509a45e592450df9236a8e40b863b79888e251a9e39756a4c0afa7433 WatchSource:0}: Error finding container e67aeba509a45e592450df9236a8e40b863b79888e251a9e39756a4c0afa7433: Status 404 returned error can't find the container with id e67aeba509a45e592450df9236a8e40b863b79888e251a9e39756a4c0afa7433 Feb 24 10:16:16 crc kubenswrapper[4698]: W0224 10:16:16.114507 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-09e0397ce0a499dbaad45b5acfc9f732f4ef4de58c4bf2507c8cc09487292bc5 WatchSource:0}: Error finding container 09e0397ce0a499dbaad45b5acfc9f732f4ef4de58c4bf2507c8cc09487292bc5: Status 404 returned error can't find the container with id 09e0397ce0a499dbaad45b5acfc9f732f4ef4de58c4bf2507c8cc09487292bc5 Feb 24 10:16:16 crc kubenswrapper[4698]: W0224 10:16:16.115731 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-855a067caeec7cf7fc7c2265e118e806fcce12787df4a438206ae321cdad7f6e WatchSource:0}: Error finding container 855a067caeec7cf7fc7c2265e118e806fcce12787df4a438206ae321cdad7f6e: 
Status 404 returned error can't find the container with id 855a067caeec7cf7fc7c2265e118e806fcce12787df4a438206ae321cdad7f6e Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.129925 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.141192 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 24 10:16:16 crc kubenswrapper[4698]: W0224 10:16:16.147637 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-967e50451f53f97b99582d01f1c0476b10c5bd93e45dd4251fc89571abed0f17 WatchSource:0}: Error finding container 967e50451f53f97b99582d01f1c0476b10c5bd93e45dd4251fc89571abed0f17: Status 404 returned error can't find the container with id 967e50451f53f97b99582d01f1c0476b10c5bd93e45dd4251fc89571abed0f17 Feb 24 10:16:16 crc kubenswrapper[4698]: W0224 10:16:16.152621 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-fc5d1dc997f4777e92f2a202cb3d9a2c92adcf03aa63e192548123dabfa35a66 WatchSource:0}: Error finding container fc5d1dc997f4777e92f2a202cb3d9a2c92adcf03aa63e192548123dabfa35a66: Status 404 returned error can't find the container with id fc5d1dc997f4777e92f2a202cb3d9a2c92adcf03aa63e192548123dabfa35a66 Feb 24 10:16:16 crc kubenswrapper[4698]: E0224 10:16:16.160145 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="800ms" Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.413915 4698 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.415575 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.415622 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.415634 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.415660 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 10:16:16 crc kubenswrapper[4698]: E0224 10:16:16.416083 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Feb 24 10:16:16 crc kubenswrapper[4698]: W0224 10:16:16.544326 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:16:16 crc kubenswrapper[4698]: E0224 10:16:16.544405 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.553454 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.620592 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fc5d1dc997f4777e92f2a202cb3d9a2c92adcf03aa63e192548123dabfa35a66"} Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.621646 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"967e50451f53f97b99582d01f1c0476b10c5bd93e45dd4251fc89571abed0f17"} Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.622485 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"855a067caeec7cf7fc7c2265e118e806fcce12787df4a438206ae321cdad7f6e"} Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.623240 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"09e0397ce0a499dbaad45b5acfc9f732f4ef4de58c4bf2507c8cc09487292bc5"} Feb 24 10:16:16 crc kubenswrapper[4698]: I0224 10:16:16.624228 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e67aeba509a45e592450df9236a8e40b863b79888e251a9e39756a4c0afa7433"} Feb 24 10:16:16 crc kubenswrapper[4698]: W0224 10:16:16.745850 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get 
"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:16:16 crc kubenswrapper[4698]: E0224 10:16:16.745920 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Feb 24 10:16:16 crc kubenswrapper[4698]: E0224 10:16:16.961596 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="1.6s" Feb 24 10:16:17 crc kubenswrapper[4698]: W0224 10:16:17.144925 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:16:17 crc kubenswrapper[4698]: E0224 10:16:17.145029 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Feb 24 10:16:17 crc kubenswrapper[4698]: W0224 10:16:17.148351 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:16:17 crc kubenswrapper[4698]: E0224 10:16:17.148399 4698 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.217147 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.218325 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.218379 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.218393 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.218424 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 10:16:17 crc kubenswrapper[4698]: E0224 10:16:17.218920 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.468616 4698 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 24 10:16:17 crc kubenswrapper[4698]: E0224 10:16:17.469735 4698 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.65:6443: connect: 
connection refused" logger="UnhandledError" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.553334 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.630333 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc"} Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.630385 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052"} Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.630399 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338"} Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.630426 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9"} Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.630425 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.631717 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.631767 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.631785 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.633122 4698 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b" exitCode=0 Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.633308 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b"} Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.633317 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.636688 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.636993 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.637039 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.639784 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.641090 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:17 crc kubenswrapper[4698]: 
I0224 10:16:17.641335 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.641505 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.655789 4698 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878" exitCode=0 Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.655903 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878"} Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.656006 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.656977 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.657011 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.657025 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.657788 4698 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="3db333a522e2af0e76373f531a9cc9402893673f8c22483e8ca704df67d0609a" exitCode=0 Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.657884 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"3db333a522e2af0e76373f531a9cc9402893673f8c22483e8ca704df67d0609a"} Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.657905 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.659993 4698 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974" exitCode=0 Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.660029 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974"} Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.660075 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.660132 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.660175 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.660194 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.661029 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.661120 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:17 crc kubenswrapper[4698]: I0224 10:16:17.661180 4698 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:18 crc kubenswrapper[4698]: W0224 10:16:18.360119 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:16:18 crc kubenswrapper[4698]: E0224 10:16:18.360428 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError" Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.553398 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:16:18 crc kubenswrapper[4698]: E0224 10:16:18.563092 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="3.2s" Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.663390 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e8674ef7a05be4a5cea669be15576bd374fdae8be9d8345ec58fc2e9b2e15d07"} Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.663419 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:18 
crc kubenswrapper[4698]: I0224 10:16:18.664352 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.664383 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.664392 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.665133 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.665089 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"532e8024f6ab49ca211330f56da50af0f46daf4569ad97723d35aa97076cde4d"} Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.665178 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"29426bbee48683a1da8ffc61612543b337ccf61119a3617bcbbb475f75dac606"} Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.665195 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2be309f6fc6bdf6f229b4a6ee32621f1385e3addb1c6655f4ee94a9e0f07e7e7"} Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.665783 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.665808 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 
10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.665820 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.667523 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c1c9fccf4ca0b7e5edd88e8a36660290b4a174c56feabb2224b0474514180f03"} Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.667549 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61"} Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.667562 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1"} Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.667575 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c"} Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.667586 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35"} Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.667738 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.668544 4698 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.668568 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.668578 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.669966 4698 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e" exitCode=0
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.670049 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.673473 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.673450 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e"}
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.674775 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.674822 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.674836 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.675487 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.675525 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.675540 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.819846 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.821188 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.821221 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.821230 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:18 crc kubenswrapper[4698]: I0224 10:16:18.821255 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:16:18 crc kubenswrapper[4698]: E0224 10:16:18.821671 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.65:6443: connect: connection refused" node="crc"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.018009 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.018404 4698 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body=
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.018481 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused"
Feb 24 10:16:19 crc kubenswrapper[4698]: W0224 10:16:19.149872 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.65:6443: connect: connection refused
Feb 24 10:16:19 crc kubenswrapper[4698]: E0224 10:16:19.150005 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.65:6443: connect: connection refused" logger="UnhandledError"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.675206 4698 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547" exitCode=0
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.675322 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547"}
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.675461 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.676445 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.676477 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.676489 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.681174 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.683602 4698 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c1c9fccf4ca0b7e5edd88e8a36660290b4a174c56feabb2224b0474514180f03" exitCode=255
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.683693 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c1c9fccf4ca0b7e5edd88e8a36660290b4a174c56feabb2224b0474514180f03"}
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.683785 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.683933 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.684121 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.684705 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.685147 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.685208 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.685232 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.685919 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.686046 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.686151 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.686299 4698 scope.go:117] "RemoveContainer" containerID="c1c9fccf4ca0b7e5edd88e8a36660290b4a174c56feabb2224b0474514180f03"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.686915 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.686965 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:19 crc kubenswrapper[4698]: I0224 10:16:19.686980 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.687666 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.689946 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7722405f18fa1aa0959f68b39e76fac95bcdac9aaf8ee95837e12a4e0953c45e"}
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.690131 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.690382 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.691008 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.691036 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.691049 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.695820 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a06990c16f9a0312f24771d4bfbbedeebbf5063afb8daaccfc4d17f60d641f5f"}
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.695893 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.695892 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1ce842ec12984cffb63c49d9c2964440e503b1225036922d25e238b978b26130"}
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.696329 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"69d7559f437e1a17b2ab3498c72ef428df69dfcc6827f78dd1edbc4a8251b5f1"}
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.696755 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.696793 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:20 crc kubenswrapper[4698]: I0224 10:16:20.696808 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.573124 4698 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.708451 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c63acc4dd56ca511d6de2f69a1f60dc53516cf4883c0355e1de373ae7fe0807f"}
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.708516 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1349d6c8aff311d876b61e13793a708952def1ba52ba669fcf8a99b27ba7db5c"}
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.708587 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.708653 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.708719 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.709917 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.709963 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.709977 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.710478 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.710511 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:21 crc kubenswrapper[4698]: I0224 10:16:21.710520 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.022556 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.024541 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.024613 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.024635 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.024679 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.461998 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.462156 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.463240 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.463290 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.463304 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.711615 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.711620 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.713013 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.713077 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.713095 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.713984 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.714031 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.714045 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:22 crc kubenswrapper[4698]: I0224 10:16:22.889630 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:23 crc kubenswrapper[4698]: I0224 10:16:23.682929 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:23 crc kubenswrapper[4698]: I0224 10:16:23.683167 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:23 crc kubenswrapper[4698]: I0224 10:16:23.684763 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:23 crc kubenswrapper[4698]: I0224 10:16:23.684809 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:23 crc kubenswrapper[4698]: I0224 10:16:23.684821 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:23 crc kubenswrapper[4698]: I0224 10:16:23.714337 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:23 crc kubenswrapper[4698]: I0224 10:16:23.715294 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:23 crc kubenswrapper[4698]: I0224 10:16:23.715328 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:23 crc kubenswrapper[4698]: I0224 10:16:23.715340 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.106741 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.107657 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.109123 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.109162 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.109188 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.114406 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:25 crc kubenswrapper[4698]: E0224 10:16:25.713747 4698 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.718661 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.718762 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.720083 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.720204 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.720303 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.927741 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.927946 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.929451 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.929478 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:25 crc kubenswrapper[4698]: I0224 10:16:25.929486 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.105451 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.167922 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.683890 4698 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.683978 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.721213 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.721213 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.722438 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.722493 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.722510 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.722993 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.723034 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:26 crc kubenswrapper[4698]: I0224 10:16:26.723048 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:27 crc kubenswrapper[4698]: I0224 10:16:27.723794 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:27 crc kubenswrapper[4698]: I0224 10:16:27.725212 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:27 crc kubenswrapper[4698]: I0224 10:16:27.725313 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:27 crc kubenswrapper[4698]: I0224 10:16:27.725334 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:29 crc kubenswrapper[4698]: W0224 10:16:29.353228 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 24 10:16:29 crc kubenswrapper[4698]: I0224 10:16:29.353453 4698 trace.go:236] Trace[811968531]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Feb-2026 10:16:19.351) (total time: 10001ms):
Feb 24 10:16:29 crc kubenswrapper[4698]: Trace[811968531]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (10:16:29.353)
Feb 24 10:16:29 crc kubenswrapper[4698]: Trace[811968531]: [10.001583813s] [10.001583813s] END
Feb 24 10:16:29 crc kubenswrapper[4698]: E0224 10:16:29.353491 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 24 10:16:29 crc kubenswrapper[4698]: I0224 10:16:29.554787 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Feb 24 10:16:29 crc kubenswrapper[4698]: E0224 10:16:29.869889 4698 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.189727499f412904 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.551523076 +0000 UTC m=+0.665137317,LastTimestamp:2026-02-24 10:16:15.551523076 +0000 UTC m=+0.665137317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 10:16:30 crc kubenswrapper[4698]: W0224 10:16:30.080249 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Feb 24 10:16:30 crc kubenswrapper[4698]: I0224 10:16:30.080407 4698 trace.go:236] Trace[1991506834]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (24-Feb-2026 10:16:20.078) (total time: 10001ms):
Feb 24 10:16:30 crc kubenswrapper[4698]: Trace[1991506834]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (10:16:30.080)
Feb 24 10:16:30 crc kubenswrapper[4698]: Trace[1991506834]: [10.001741536s] [10.001741536s] END
Feb 24 10:16:30 crc kubenswrapper[4698]: E0224 10:16:30.080442 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.051117 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:31Z is after 2026-02-23T05:33:13Z
Feb 24 10:16:31 crc kubenswrapper[4698]: W0224 10:16:31.051815 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:31Z is after 2026-02-23T05:33:13Z
Feb 24 10:16:31 crc kubenswrapper[4698]: E0224 10:16:31.051933 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.052472 4698 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.052576 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 24 10:16:31 crc kubenswrapper[4698]: W0224 10:16:31.053928 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:31Z is after 2026-02-23T05:33:13Z
Feb 24 10:16:31 crc kubenswrapper[4698]: E0224 10:16:31.054025 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 10:16:31 crc kubenswrapper[4698]: E0224 10:16:31.060482 4698 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:31Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 10:16:31 crc kubenswrapper[4698]: E0224 10:16:31.060894 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:31Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.061076 4698 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.061172 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Feb 24 10:16:31 crc kubenswrapper[4698]: E0224 10:16:31.063134 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:31Z is after 2026-02-23T05:33:13Z" node="crc"
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.558438 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:31Z is after 2026-02-23T05:33:13Z
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.737011 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.737925 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.739935 4698 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7722405f18fa1aa0959f68b39e76fac95bcdac9aaf8ee95837e12a4e0953c45e" exitCode=255
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.739979 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7722405f18fa1aa0959f68b39e76fac95bcdac9aaf8ee95837e12a4e0953c45e"}
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.740042 4698 scope.go:117] "RemoveContainer" containerID="c1c9fccf4ca0b7e5edd88e8a36660290b4a174c56feabb2224b0474514180f03"
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.740143 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.741020 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.741048 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.741057 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:31 crc kubenswrapper[4698]: I0224 10:16:31.741517 4698 scope.go:117] "RemoveContainer" containerID="7722405f18fa1aa0959f68b39e76fac95bcdac9aaf8ee95837e12a4e0953c45e"
Feb 24 10:16:31 crc kubenswrapper[4698]: E0224 10:16:31.741661 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 10:16:32 crc kubenswrapper[4698]: I0224 10:16:32.338730 4698 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:32 crc kubenswrapper[4698]: I0224 10:16:32.557734 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:32Z is after 2026-02-23T05:33:13Z
Feb 24 10:16:32 crc kubenswrapper[4698]: I0224 10:16:32.744696 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 24 10:16:32 crc kubenswrapper[4698]: I0224 10:16:32.748076 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:32 crc kubenswrapper[4698]: I0224 10:16:32.753101 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:32 crc kubenswrapper[4698]: I0224 10:16:32.753194 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:32 crc kubenswrapper[4698]: I0224 10:16:32.753225 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:32 crc kubenswrapper[4698]: I0224 10:16:32.754425 4698 scope.go:117] "RemoveContainer" containerID="7722405f18fa1aa0959f68b39e76fac95bcdac9aaf8ee95837e12a4e0953c45e"
Feb 24 10:16:32 crc kubenswrapper[4698]: E0224 10:16:32.754867 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 10:16:33 crc kubenswrapper[4698]: W0224 10:16:33.164957 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:33Z is after 2026-02-23T05:33:13Z
Feb 24 10:16:33 crc kubenswrapper[4698]: E0224 10:16:33.165096 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:33Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Feb 24 10:16:33 crc kubenswrapper[4698]: I0224 10:16:33.557938 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:33Z is after 2026-02-23T05:33:13Z
Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.028298 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.028566 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.030832 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.030912 4698 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.030939 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.031873 4698 scope.go:117] "RemoveContainer" containerID="7722405f18fa1aa0959f68b39e76fac95bcdac9aaf8ee95837e12a4e0953c45e" Feb 24 10:16:34 crc kubenswrapper[4698]: E0224 10:16:34.032181 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.037437 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.558170 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:34Z is after 2026-02-23T05:33:13Z Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.754567 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.755989 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.756049 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:34 crc 
kubenswrapper[4698]: I0224 10:16:34.756067 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:34 crc kubenswrapper[4698]: I0224 10:16:34.756852 4698 scope.go:117] "RemoveContainer" containerID="7722405f18fa1aa0959f68b39e76fac95bcdac9aaf8ee95837e12a4e0953c45e" Feb 24 10:16:34 crc kubenswrapper[4698]: E0224 10:16:34.757127 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 10:16:35 crc kubenswrapper[4698]: I0224 10:16:35.557977 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:35Z is after 2026-02-23T05:33:13Z Feb 24 10:16:35 crc kubenswrapper[4698]: E0224 10:16:35.713918 4698 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 10:16:35 crc kubenswrapper[4698]: I0224 10:16:35.962858 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 24 10:16:35 crc kubenswrapper[4698]: I0224 10:16:35.963152 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:35 crc kubenswrapper[4698]: I0224 10:16:35.964636 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:35 crc kubenswrapper[4698]: I0224 10:16:35.964687 4698 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:35 crc kubenswrapper[4698]: I0224 10:16:35.964702 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:35 crc kubenswrapper[4698]: I0224 10:16:35.981333 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 24 10:16:36 crc kubenswrapper[4698]: W0224 10:16:36.171179 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:36Z is after 2026-02-23T05:33:13Z Feb 24 10:16:36 crc kubenswrapper[4698]: E0224 10:16:36.171291 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Feb 24 10:16:36 crc kubenswrapper[4698]: I0224 10:16:36.557334 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:16:36Z is after 2026-02-23T05:33:13Z Feb 24 10:16:36 crc kubenswrapper[4698]: I0224 10:16:36.684362 4698 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:16:36 crc kubenswrapper[4698]: I0224 10:16:36.684524 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 10:16:36 crc kubenswrapper[4698]: I0224 10:16:36.761380 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:36 crc kubenswrapper[4698]: I0224 10:16:36.762541 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:36 crc kubenswrapper[4698]: I0224 10:16:36.762602 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:36 crc kubenswrapper[4698]: I0224 10:16:36.762623 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:37 crc kubenswrapper[4698]: I0224 10:16:37.463977 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:37 crc kubenswrapper[4698]: I0224 10:16:37.465618 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:37 crc kubenswrapper[4698]: I0224 10:16:37.465676 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:37 crc kubenswrapper[4698]: I0224 10:16:37.465702 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:16:37 crc 
kubenswrapper[4698]: I0224 10:16:37.465746 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 10:16:37 crc kubenswrapper[4698]: E0224 10:16:37.472030 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 24 10:16:37 crc kubenswrapper[4698]: E0224 10:16:37.472878 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 24 10:16:37 crc kubenswrapper[4698]: I0224 10:16:37.560885 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:16:37 crc kubenswrapper[4698]: W0224 10:16:37.748673 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Feb 24 10:16:37 crc kubenswrapper[4698]: E0224 10:16:37.748740 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Feb 24 10:16:38 crc kubenswrapper[4698]: W0224 10:16:38.421858 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 
24 10:16:38 crc kubenswrapper[4698]: E0224 10:16:38.421945 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 24 10:16:38 crc kubenswrapper[4698]: I0224 10:16:38.561036 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:16:39 crc kubenswrapper[4698]: I0224 10:16:39.207601 4698 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 24 10:16:39 crc kubenswrapper[4698]: I0224 10:16:39.231434 4698 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 24 10:16:39 crc kubenswrapper[4698]: I0224 10:16:39.561378 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.878081 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189727499f412904 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.551523076 +0000 UTC m=+0.665137317,LastTimestamp:2026-02-24 10:16:15.551523076 
+0000 UTC m=+0.665137317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.887640 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cc4852 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.610972242 +0000 UTC m=+0.724586493,LastTimestamp:2026-02-24 10:16:15.610972242 +0000 UTC m=+0.724586493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.895790 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2ccdee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611010793 +0000 UTC m=+0.724625044,LastTimestamp:2026-02-24 10:16:15.611010793 +0000 UTC m=+0.724625044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: 
E0224 10:16:39.906195 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cd277d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611029373 +0000 UTC m=+0.724643624,LastTimestamp:2026-02-24 10:16:15.611029373 +0000 UTC m=+0.724643624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.909653 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a8acb4fd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.709566205 +0000 UTC m=+0.823180446,LastTimestamp:2026-02-24 10:16:15.709566205 +0000 UTC m=+0.823180446,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.913058 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cc4852\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the 
namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cc4852 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.610972242 +0000 UTC m=+0.724586493,LastTimestamp:2026-02-24 10:16:15.715577932 +0000 UTC m=+0.829192173,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.915941 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2ccdee9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2ccdee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611010793 +0000 UTC m=+0.724625044,LastTimestamp:2026-02-24 10:16:15.715595393 +0000 UTC m=+0.829209634,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.920984 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cd277d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cd277d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611029373 +0000 UTC m=+0.724643624,LastTimestamp:2026-02-24 10:16:15.715603733 +0000 UTC m=+0.829217974,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.927325 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cc4852\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cc4852 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.610972242 +0000 UTC m=+0.724586493,LastTimestamp:2026-02-24 10:16:15.716464739 +0000 UTC m=+0.830078980,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.931799 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2ccdee9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2ccdee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611010793 +0000 UTC m=+0.724625044,LastTimestamp:2026-02-24 10:16:15.716479239 +0000 UTC m=+0.830093480,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.938654 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cd277d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cd277d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611029373 +0000 UTC m=+0.724643624,LastTimestamp:2026-02-24 10:16:15.716531579 +0000 UTC m=+0.830145820,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.944668 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cc4852\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cc4852 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.610972242 +0000 UTC 
m=+0.724586493,LastTimestamp:2026-02-24 10:16:15.716841431 +0000 UTC m=+0.830455692,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.949110 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2ccdee9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2ccdee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611010793 +0000 UTC m=+0.724625044,LastTimestamp:2026-02-24 10:16:15.716869112 +0000 UTC m=+0.830483373,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.953923 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cd277d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cd277d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611029373 +0000 UTC m=+0.724643624,LastTimestamp:2026-02-24 10:16:15.716884893 +0000 UTC m=+0.830499154,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.957977 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cc4852\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cc4852 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.610972242 +0000 UTC m=+0.724586493,LastTimestamp:2026-02-24 10:16:15.717328436 +0000 UTC m=+0.830942667,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.962887 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2ccdee9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2ccdee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611010793 +0000 UTC m=+0.724625044,LastTimestamp:2026-02-24 10:16:15.717338046 +0000 UTC m=+0.830952287,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.969712 4698 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cd277d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cd277d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611029373 +0000 UTC m=+0.724643624,LastTimestamp:2026-02-24 10:16:15.717345236 +0000 UTC m=+0.830959477,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.973898 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cc4852\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cc4852 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.610972242 +0000 UTC m=+0.724586493,LastTimestamp:2026-02-24 10:16:15.717745579 +0000 UTC m=+0.831359820,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.985073 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2ccdee9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2ccdee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611010793 +0000 UTC m=+0.724625044,LastTimestamp:2026-02-24 10:16:15.717754589 +0000 UTC m=+0.831368820,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.992422 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cd277d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cd277d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611029373 +0000 UTC m=+0.724643624,LastTimestamp:2026-02-24 10:16:15.717761499 +0000 UTC m=+0.831375740,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:39 crc kubenswrapper[4698]: E0224 10:16:39.998743 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cc4852\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cc4852 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.610972242 +0000 UTC m=+0.724586493,LastTimestamp:2026-02-24 10:16:15.718557426 +0000 UTC m=+0.832171667,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.006860 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2ccdee9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2ccdee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611010793 +0000 UTC m=+0.724625044,LastTimestamp:2026-02-24 10:16:15.718575896 +0000 UTC m=+0.832190137,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.013522 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cd277d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cd277d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611029373 +0000 UTC m=+0.724643624,LastTimestamp:2026-02-24 10:16:15.718586206 +0000 UTC m=+0.832200457,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.017967 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2cc4852\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2cc4852 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.610972242 +0000 UTC m=+0.724586493,LastTimestamp:2026-02-24 10:16:15.718604886 +0000 UTC m=+0.832219147,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.025600 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.18972749a2ccdee9\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.18972749a2ccdee9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:15.611010793 +0000 UTC 
m=+0.724625044,LastTimestamp:2026-02-24 10:16:15.718624726 +0000 UTC m=+0.832238977,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.032859 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18972749c07a1a0c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.108902924 +0000 UTC m=+1.222517195,LastTimestamp:2026-02-24 10:16:16.108902924 +0000 UTC m=+1.222517195,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.040149 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18972749c0fbfe58 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.117415512 +0000 UTC m=+1.231029753,LastTimestamp:2026-02-24 10:16:16.117415512 +0000 UTC m=+1.231029753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.045970 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18972749c10d50f2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.11855077 +0000 UTC m=+1.232165041,LastTimestamp:2026-02-24 10:16:16.11855077 +0000 UTC m=+1.232165041,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.052318 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18972749c2ef9874 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.150157428 +0000 UTC m=+1.263771679,LastTimestamp:2026-02-24 10:16:16.150157428 +0000 UTC m=+1.263771679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.059030 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18972749c34bb68d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.156194445 +0000 UTC m=+1.269808686,LastTimestamp:2026-02-24 10:16:16.156194445 +0000 UTC m=+1.269808686,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.064102 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18972749e87534f5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.779670773 +0000 UTC m=+1.893285014,LastTimestamp:2026-02-24 10:16:16.779670773 +0000 UTC m=+1.893285014,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.068420 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18972749e87dc173 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.780231027 +0000 UTC 
m=+1.893845278,LastTimestamp:2026-02-24 10:16:16.780231027 +0000 UTC m=+1.893845278,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.073481 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18972749e8809b60 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.780417888 +0000 UTC m=+1.894032149,LastTimestamp:2026-02-24 10:16:16.780417888 +0000 UTC m=+1.894032149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.077926 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18972749e88167cb openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.780470219 +0000 UTC 
m=+1.894084460,LastTimestamp:2026-02-24 10:16:16.780470219 +0000 UTC m=+1.894084460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.083178 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18972749e881b84d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.780490829 +0000 UTC m=+1.894105070,LastTimestamp:2026-02-24 10:16:16.780490829 +0000 UTC m=+1.894105070,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.088155 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18972749e8f8bc8c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container 
kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.7882907 +0000 UTC m=+1.901904941,LastTimestamp:2026-02-24 10:16:16.7882907 +0000 UTC m=+1.901904941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.092552 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18972749e90c81cd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.789586381 +0000 UTC m=+1.903200622,LastTimestamp:2026-02-24 10:16:16.789586381 +0000 UTC m=+1.903200622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.097204 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.18972749e9bd3a4f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.801167951 +0000 UTC m=+1.914782192,LastTimestamp:2026-02-24 10:16:16.801167951 +0000 UTC m=+1.914782192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.104097 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.18972749e9f5df0d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.804880141 +0000 UTC m=+1.918494382,LastTimestamp:2026-02-24 10:16:16.804880141 +0000 UTC m=+1.918494382,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.114929 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.18972749e9fa325d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.805163613 +0000 UTC m=+1.918777854,LastTimestamp:2026-02-24 10:16:16.805163613 +0000 UTC m=+1.918777854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.121230 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.18972749ea00567b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.805566075 +0000 UTC m=+1.919180316,LastTimestamp:2026-02-24 10:16:16.805566075 +0000 UTC m=+1.919180316,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.128616 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18972749fe9cf0ae 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.151373486 +0000 UTC m=+2.264987727,LastTimestamp:2026-02-24 10:16:17.151373486 +0000 UTC m=+2.264987727,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.135403 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274a001a5648 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.176368712 +0000 UTC m=+2.289982973,LastTimestamp:2026-02-24 10:16:17.176368712 +0000 UTC m=+2.289982973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.142789 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274a002d60d0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.177616592 +0000 UTC m=+2.291230843,LastTimestamp:2026-02-24 10:16:17.177616592 +0000 UTC m=+2.291230843,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.150062 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274a0cf84cd0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.392241872 +0000 UTC m=+2.505856123,LastTimestamp:2026-02-24 10:16:17.392241872 +0000 UTC 
m=+2.505856123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.156680 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274a0db41636 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.404548662 +0000 UTC m=+2.518162913,LastTimestamp:2026-02-24 10:16:17.404548662 +0000 UTC m=+2.518162913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.163339 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274a0dc2494b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.405479243 +0000 UTC m=+2.519093474,LastTimestamp:2026-02-24 10:16:17.405479243 +0000 UTC m=+2.519093474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.170547 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274a1868b720 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.584158496 +0000 UTC m=+2.697772757,LastTimestamp:2026-02-24 10:16:17.584158496 +0000 UTC m=+2.697772757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.177055 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274a19209c55 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.596210261 +0000 UTC m=+2.709824522,LastTimestamp:2026-02-24 10:16:17.596210261 +0000 UTC m=+2.709824522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.183927 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a1bb46aaf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.639451311 +0000 UTC m=+2.753065592,LastTimestamp:2026-02-24 10:16:17.639451311 +0000 UTC m=+2.753065592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.190738 4698 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274a1cd616eb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.658435307 +0000 UTC m=+2.772049548,LastTimestamp:2026-02-24 10:16:17.658435307 +0000 UTC m=+2.772049548,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.197957 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897274a1d09c32e openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.661821742 +0000 UTC 
m=+2.775436003,LastTimestamp:2026-02-24 10:16:17.661821742 +0000 UTC m=+2.775436003,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.204560 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897274a1d56a5fe openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.666860542 +0000 UTC m=+2.780474793,LastTimestamp:2026-02-24 10:16:17.666860542 +0000 UTC m=+2.780474793,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.210948 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a282d9e64 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.848720996 +0000 UTC m=+2.962335237,LastTimestamp:2026-02-24 10:16:17.848720996 +0000 UTC m=+2.962335237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.215989 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897274a28cc52a1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.859121825 +0000 UTC m=+2.972736066,LastTimestamp:2026-02-24 10:16:17.859121825 +0000 UTC m=+2.972736066,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.222513 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274a29359974 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.866021236 +0000 UTC m=+2.979635477,LastTimestamp:2026-02-24 10:16:17.866021236 +0000 UTC m=+2.979635477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.229474 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897274a29e2a808 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.877362696 +0000 UTC m=+2.990976957,LastTimestamp:2026-02-24 10:16:17.877362696 +0000 UTC m=+2.990976957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.236699 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897274a2a058d6d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.879649645 +0000 UTC m=+2.993263966,LastTimestamp:2026-02-24 10:16:17.879649645 +0000 UTC m=+2.993263966,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.243909 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a2a0a30a4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.879953572 +0000 UTC m=+2.993567813,LastTimestamp:2026-02-24 10:16:17.879953572 +0000 UTC m=+2.993567813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.249460 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a2a29f020 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.882034208 +0000 UTC m=+2.995648459,LastTimestamp:2026-02-24 10:16:17.882034208 +0000 UTC m=+2.995648459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.254324 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897274a2a3e24f9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.883358457 +0000 UTC m=+2.996972698,LastTimestamp:2026-02-24 10:16:17.883358457 +0000 UTC 
m=+2.996972698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.259034 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.1897274a2b428bba openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.900424122 +0000 UTC m=+3.014038373,LastTimestamp:2026-02-24 10:16:17.900424122 +0000 UTC m=+3.014038373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.264628 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a343aefff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container 
kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.050920447 +0000 UTC m=+3.164534678,LastTimestamp:2026-02-24 10:16:18.050920447 +0000 UTC m=+3.164534678,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.269881 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897274a353b090c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.067704076 +0000 UTC m=+3.181318317,LastTimestamp:2026-02-24 10:16:18.067704076 +0000 UTC m=+3.181318317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.274917 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a353d5985 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.067855749 +0000 UTC m=+3.181469980,LastTimestamp:2026-02-24 10:16:18.067855749 +0000 UTC m=+3.181469980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.279348 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a354d4bd5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.068900821 +0000 UTC m=+3.182515062,LastTimestamp:2026-02-24 10:16:18.068900821 +0000 UTC m=+3.182515062,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.286010 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897274a369588b6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.090412214 +0000 UTC m=+3.204026455,LastTimestamp:2026-02-24 10:16:18.090412214 +0000 UTC m=+3.204026455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.292042 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897274a36afa97d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.092124541 +0000 UTC m=+3.205738772,LastTimestamp:2026-02-24 10:16:18.092124541 +0000 UTC m=+3.205738772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.297389 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a40ad51f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.259743223 +0000 UTC m=+3.373357464,LastTimestamp:2026-02-24 10:16:18.259743223 +0000 UTC m=+3.373357464,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.300876 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897274a40ece264 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.263908964 
+0000 UTC m=+3.377523195,LastTimestamp:2026-02-24 10:16:18.263908964 +0000 UTC m=+3.377523195,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.307107 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a41678934 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.27194706 +0000 UTC m=+3.385561301,LastTimestamp:2026-02-24 10:16:18.27194706 +0000 UTC m=+3.385561301,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.313853 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a4176b668 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.272941672 +0000 UTC m=+3.386555913,LastTimestamp:2026-02-24 10:16:18.272941672 +0000 UTC m=+3.386555913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.321116 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.1897274a423f16e2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.28607357 +0000 UTC m=+3.399687811,LastTimestamp:2026-02-24 10:16:18.28607357 +0000 UTC m=+3.399687811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.324791 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274a48ab3818 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.393823256 +0000 UTC m=+3.507437497,LastTimestamp:2026-02-24 10:16:18.393823256 +0000 UTC m=+3.507437497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.331552 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a4c0ca92a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.450540842 +0000 UTC m=+3.564155123,LastTimestamp:2026-02-24 10:16:18.450540842 +0000 UTC m=+3.564155123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.337531 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a4ca7cc10 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.460707856 +0000 UTC m=+3.574322137,LastTimestamp:2026-02-24 10:16:18.460707856 +0000 UTC m=+3.574322137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.341400 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a4cbb30f0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.461978864 +0000 UTC m=+3.575593105,LastTimestamp:2026-02-24 10:16:18.461978864 +0000 UTC m=+3.575593105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.345985 4698 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a5768dccc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.641132748 +0000 UTC m=+3.754746989,LastTimestamp:2026-02-24 10:16:18.641132748 +0000 UTC m=+3.754746989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.350063 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a5821427c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.653217404 +0000 UTC m=+3.766831645,LastTimestamp:2026-02-24 10:16:18.653217404 +0000 UTC m=+3.766831645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc 
kubenswrapper[4698]: E0224 10:16:40.355733 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274a598e72bf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.677150399 +0000 UTC m=+3.790764640,LastTimestamp:2026-02-24 10:16:18.677150399 +0000 UTC m=+3.790764640,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.361781 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274a64890838 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.861344824 +0000 UTC m=+3.974959065,LastTimestamp:2026-02-24 10:16:18.861344824 +0000 UTC m=+3.974959065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.365941 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274a654af537 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.874053943 +0000 UTC m=+3.987668184,LastTimestamp:2026-02-24 10:16:18.874053943 +0000 UTC m=+3.987668184,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.369683 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 24 10:16:40 crc kubenswrapper[4698]: &Event{ObjectMeta:{kube-apiserver-crc.1897274a6de676cd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:6443/livez": dial tcp 192.168.126.11:6443: connect: connection refused Feb 24 10:16:40 crc kubenswrapper[4698]: body: Feb 24 10:16:40 crc kubenswrapper[4698]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:19.018462925 +0000 UTC m=+4.132077206,LastTimestamp:2026-02-24 10:16:19.018462925 +0000 UTC m=+4.132077206,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 24 10:16:40 crc kubenswrapper[4698]: > Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.374202 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a6de76c6e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:19.018525806 +0000 UTC m=+4.132140077,LastTimestamp:2026-02-24 10:16:19.018525806 +0000 UTC m=+4.132140077,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.378961 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274a9538a180 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:19.678159232 +0000 UTC m=+4.791773513,LastTimestamp:2026-02-24 10:16:19.678159232 +0000 UTC m=+4.791773513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.384212 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897274a4cbb30f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a4cbb30f0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.461978864 +0000 UTC m=+3.575593105,LastTimestamp:2026-02-24 10:16:19.690737738 +0000 UTC m=+4.804351989,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.388372 4698 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897274a5768dccc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a5768dccc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.641132748 +0000 UTC m=+3.754746989,LastTimestamp:2026-02-24 10:16:19.911056726 +0000 UTC m=+5.024670977,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.391608 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274aa329c0c3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:19.912065219 +0000 UTC m=+5.025679460,LastTimestamp:2026-02-24 10:16:19.912065219 +0000 UTC m=+5.025679460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc 
kubenswrapper[4698]: E0224 10:16:40.397352 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897274a5821427c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274a5821427c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:18.653217404 +0000 UTC m=+3.766831645,LastTimestamp:2026-02-24 10:16:19.92900198 +0000 UTC m=+5.042616221,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.400542 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274aa43c1ec0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:19.930046144 +0000 UTC m=+5.043660375,LastTimestamp:2026-02-24 10:16:19.930046144 +0000 UTC m=+5.043660375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.403945 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274aa4506d39 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:19.931376953 +0000 UTC m=+5.044991194,LastTimestamp:2026-02-24 10:16:19.931376953 +0000 UTC m=+5.044991194,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.407572 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274ab2593677 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:20.166833783 +0000 UTC m=+5.280448054,LastTimestamp:2026-02-24 10:16:20.166833783 +0000 UTC 
m=+5.280448054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.412220 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274ab3328f38 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:20.181077816 +0000 UTC m=+5.294692097,LastTimestamp:2026-02-24 10:16:20.181077816 +0000 UTC m=+5.294692097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.416115 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274ab34dee7b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:20.182871675 +0000 UTC 
m=+5.296485956,LastTimestamp:2026-02-24 10:16:20.182871675 +0000 UTC m=+5.296485956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.420481 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274ac215efb3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:20.430860211 +0000 UTC m=+5.544474452,LastTimestamp:2026-02-24 10:16:20.430860211 +0000 UTC m=+5.544474452,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.424455 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274ac30669e0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:20.446620128 +0000 UTC m=+5.560234369,LastTimestamp:2026-02-24 10:16:20.446620128 +0000 UTC 
m=+5.560234369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.429223 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274ac3132700 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:20.447454976 +0000 UTC m=+5.561069217,LastTimestamp:2026-02-24 10:16:20.447454976 +0000 UTC m=+5.561069217,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.436056 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274ad296a819 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:20.707731481 +0000 UTC 
m=+5.821345742,LastTimestamp:2026-02-24 10:16:20.707731481 +0000 UTC m=+5.821345742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.439861 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274ad36593da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:20.72129225 +0000 UTC m=+5.834906531,LastTimestamp:2026-02-24 10:16:20.72129225 +0000 UTC m=+5.834906531,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.445860 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274ad37b931b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:20.722733851 +0000 UTC m=+5.836348102,LastTimestamp:2026-02-24 10:16:20.722733851 +0000 UTC m=+5.836348102,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.452104 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274ae59ea32d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:21.027021613 +0000 UTC m=+6.140635894,LastTimestamp:2026-02-24 10:16:21.027021613 +0000 UTC m=+6.140635894,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.458634 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897274ae6ba8ce3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 
10:16:21.045628131 +0000 UTC m=+6.159242412,LastTimestamp:2026-02-24 10:16:21.045628131 +0000 UTC m=+6.159242412,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.468665 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 24 10:16:40 crc kubenswrapper[4698]: &Event{ObjectMeta:{kube-controller-manager-crc.1897274c36cc8b5b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Feb 24 10:16:40 crc kubenswrapper[4698]: body: Feb 24 10:16:40 crc kubenswrapper[4698]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:26.683951963 +0000 UTC m=+11.797566214,LastTimestamp:2026-02-24 10:16:26.683951963 +0000 UTC m=+11.797566214,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 24 10:16:40 crc kubenswrapper[4698]: > Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.476651 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274c36cd777e 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:26.684012414 +0000 UTC m=+11.797626665,LastTimestamp:2026-02-24 10:16:26.684012414 +0000 UTC m=+11.797626665,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.486703 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 24 10:16:40 crc kubenswrapper[4698]: &Event{ObjectMeta:{kube-apiserver-crc.1897274d3b2fd77c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 24 10:16:40 crc kubenswrapper[4698]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 24 10:16:40 crc kubenswrapper[4698]: Feb 24 10:16:40 crc kubenswrapper[4698]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:31.052535676 
+0000 UTC m=+16.166149957,LastTimestamp:2026-02-24 10:16:31.052535676 +0000 UTC m=+16.166149957,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 24 10:16:40 crc kubenswrapper[4698]: > Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.491630 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897274d3b312035 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:31.052619829 +0000 UTC m=+16.166234110,LastTimestamp:2026-02-24 10:16:31.052619829 +0000 UTC m=+16.166234110,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.497737 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 24 10:16:40 crc kubenswrapper[4698]: &Event{ObjectMeta:{kube-controller-manager-crc.1897274e8ae0936f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 24 10:16:40 crc kubenswrapper[4698]: body: Feb 24 10:16:40 crc kubenswrapper[4698]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:36.684485487 +0000 UTC m=+21.798099768,LastTimestamp:2026-02-24 10:16:36.684485487 +0000 UTC m=+21.798099768,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 24 10:16:40 crc kubenswrapper[4698]: > Feb 24 10:16:40 crc kubenswrapper[4698]: E0224 10:16:40.502797 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274e8ae1ea74 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:36.6845733 +0000 UTC m=+21.798187571,LastTimestamp:2026-02-24 10:16:36.6845733 +0000 UTC 
m=+21.798187571,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:16:40 crc kubenswrapper[4698]: I0224 10:16:40.557400 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:16:41 crc kubenswrapper[4698]: I0224 10:16:41.561475 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:16:42 crc kubenswrapper[4698]: I0224 10:16:42.560882 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:16:43 crc kubenswrapper[4698]: I0224 10:16:43.560381 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:16:44 crc kubenswrapper[4698]: I0224 10:16:44.473013 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:16:44 crc kubenswrapper[4698]: I0224 10:16:44.475458 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:16:44 crc kubenswrapper[4698]: I0224 10:16:44.475624 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:16:44 crc kubenswrapper[4698]: I0224 10:16:44.475708 4698 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:44 crc kubenswrapper[4698]: I0224 10:16:44.475813 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:16:44 crc kubenswrapper[4698]: E0224 10:16:44.479769 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 24 10:16:44 crc kubenswrapper[4698]: E0224 10:16:44.480065 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 24 10:16:44 crc kubenswrapper[4698]: I0224 10:16:44.560298 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:44 crc kubenswrapper[4698]: W0224 10:16:44.964019 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Feb 24 10:16:44 crc kubenswrapper[4698]: E0224 10:16:44.964075 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Feb 24 10:16:45 crc kubenswrapper[4698]: I0224 10:16:45.560490 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:45 crc kubenswrapper[4698]: I0224 10:16:45.614857 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:45 crc kubenswrapper[4698]: I0224 10:16:45.617150 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:45 crc kubenswrapper[4698]: I0224 10:16:45.617248 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:45 crc kubenswrapper[4698]: I0224 10:16:45.617312 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:45 crc kubenswrapper[4698]: I0224 10:16:45.618541 4698 scope.go:117] "RemoveContainer" containerID="7722405f18fa1aa0959f68b39e76fac95bcdac9aaf8ee95837e12a4e0953c45e"
Feb 24 10:16:45 crc kubenswrapper[4698]: E0224 10:16:45.714114 4698 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.562194 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.683963 4698 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.684016 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.684077 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.684208 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.685291 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.685362 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.685392 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.686052 4698 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.686346 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338" gracePeriod=30
Feb 24 10:16:46 crc kubenswrapper[4698]: E0224 10:16:46.689720 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897274e8ae0936f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 24 10:16:46 crc kubenswrapper[4698]: 	&Event{ObjectMeta:{kube-controller-manager-crc.1897274e8ae0936f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 24 10:16:46 crc kubenswrapper[4698]: 	body:
Feb 24 10:16:46 crc kubenswrapper[4698]: 	,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:36.684485487 +0000 UTC m=+21.798099768,LastTimestamp:2026-02-24 10:16:46.684001537 +0000 UTC m=+31.797615778,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 24 10:16:46 crc kubenswrapper[4698]: 	>
Feb 24 10:16:46 crc kubenswrapper[4698]: E0224 10:16:46.694115 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897274e8ae1ea74\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274e8ae1ea74 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:36.6845733 +0000 UTC m=+21.798187571,LastTimestamp:2026-02-24 10:16:46.684048038 +0000 UTC m=+31.797662299,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 10:16:46 crc kubenswrapper[4698]: E0224 10:16:46.698297 4698 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18972750df086845 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:46.686316613 +0000 UTC m=+31.799930894,LastTimestamp:2026-02-24 10:16:46.686316613 +0000 UTC m=+31.799930894,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.794475 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.795447 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.799151 4698 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="711fa37ba0589a56b7cfdbca923b12c18a2088ff53b10b6395e1b4f866c198b1" exitCode=255
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.799214 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"711fa37ba0589a56b7cfdbca923b12c18a2088ff53b10b6395e1b4f866c198b1"}
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.799302 4698 scope.go:117] "RemoveContainer" containerID="7722405f18fa1aa0959f68b39e76fac95bcdac9aaf8ee95837e12a4e0953c45e"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.799519 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.801694 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.801747 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.801770 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:46 crc kubenswrapper[4698]: I0224 10:16:46.802741 4698 scope.go:117] "RemoveContainer" containerID="711fa37ba0589a56b7cfdbca923b12c18a2088ff53b10b6395e1b4f866c198b1"
Feb 24 10:16:46 crc kubenswrapper[4698]: E0224 10:16:46.803169 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 10:16:46 crc kubenswrapper[4698]: E0224 10:16:46.948100 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18972749e90c81cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18972749e90c81cd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:16.789586381 +0000 UTC m=+1.903200622,LastTimestamp:2026-02-24 10:16:46.940150432 +0000 UTC m=+32.053764713,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 10:16:46 crc kubenswrapper[4698]: W0224 10:16:46.962243 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Feb 24 10:16:46 crc kubenswrapper[4698]: E0224 10:16:46.962352 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Feb 24 10:16:47 crc kubenswrapper[4698]: E0224 10:16:47.205923 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.18972749fe9cf0ae\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.18972749fe9cf0ae openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.151373486 +0000 UTC m=+2.264987727,LastTimestamp:2026-02-24 10:16:47.194584205 +0000 UTC m=+32.308198436,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 10:16:47 crc kubenswrapper[4698]: E0224 10:16:47.219729 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897274a001a5648\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274a001a5648 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:17.176368712 +0000 UTC m=+2.289982973,LastTimestamp:2026-02-24 10:16:47.212676242 +0000 UTC m=+32.326290513,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 10:16:47 crc kubenswrapper[4698]: I0224 10:16:47.562011 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:47 crc kubenswrapper[4698]: I0224 10:16:47.804736 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 24 10:16:47 crc kubenswrapper[4698]: I0224 10:16:47.805170 4698 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338" exitCode=255
Feb 24 10:16:47 crc kubenswrapper[4698]: I0224 10:16:47.805256 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338"}
Feb 24 10:16:47 crc kubenswrapper[4698]: I0224 10:16:47.805320 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc"}
Feb 24 10:16:47 crc kubenswrapper[4698]: I0224 10:16:47.805443 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:47 crc kubenswrapper[4698]: I0224 10:16:47.806527 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:47 crc kubenswrapper[4698]: I0224 10:16:47.806570 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:47 crc kubenswrapper[4698]: I0224 10:16:47.806586 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:47 crc kubenswrapper[4698]: I0224 10:16:47.807512 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Feb 24 10:16:48 crc kubenswrapper[4698]: I0224 10:16:48.560103 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:48 crc kubenswrapper[4698]: I0224 10:16:48.573251 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:48 crc kubenswrapper[4698]: I0224 10:16:48.573576 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:48 crc kubenswrapper[4698]: I0224 10:16:48.575143 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:48 crc kubenswrapper[4698]: I0224 10:16:48.575192 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:48 crc kubenswrapper[4698]: I0224 10:16:48.575213 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:48 crc kubenswrapper[4698]: I0224 10:16:48.576017 4698 scope.go:117] "RemoveContainer" containerID="711fa37ba0589a56b7cfdbca923b12c18a2088ff53b10b6395e1b4f866c198b1"
Feb 24 10:16:48 crc kubenswrapper[4698]: E0224 10:16:48.576380 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 10:16:49 crc kubenswrapper[4698]: I0224 10:16:49.558923 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:50 crc kubenswrapper[4698]: I0224 10:16:50.555407 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:51 crc kubenswrapper[4698]: I0224 10:16:51.480436 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:51 crc kubenswrapper[4698]: I0224 10:16:51.482104 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:51 crc kubenswrapper[4698]: I0224 10:16:51.482142 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:51 crc kubenswrapper[4698]: I0224 10:16:51.482152 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:51 crc kubenswrapper[4698]: I0224 10:16:51.482178 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:16:51 crc kubenswrapper[4698]: E0224 10:16:51.488516 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 24 10:16:51 crc kubenswrapper[4698]: E0224 10:16:51.488707 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 24 10:16:51 crc kubenswrapper[4698]: I0224 10:16:51.559965 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.339040 4698 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.339228 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.340355 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.340393 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.340407 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.340858 4698 scope.go:117] "RemoveContainer" containerID="711fa37ba0589a56b7cfdbca923b12c18a2088ff53b10b6395e1b4f866c198b1"
Feb 24 10:16:52 crc kubenswrapper[4698]: E0224 10:16:52.341004 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.462335 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.462544 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.464003 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.464064 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.464082 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:52 crc kubenswrapper[4698]: I0224 10:16:52.557200 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:53 crc kubenswrapper[4698]: I0224 10:16:53.557755 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:53 crc kubenswrapper[4698]: I0224 10:16:53.682929 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:16:53 crc kubenswrapper[4698]: I0224 10:16:53.683127 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:53 crc kubenswrapper[4698]: I0224 10:16:53.684600 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:53 crc kubenswrapper[4698]: I0224 10:16:53.684666 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:53 crc kubenswrapper[4698]: I0224 10:16:53.684689 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:54 crc kubenswrapper[4698]: I0224 10:16:54.557774 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:55 crc kubenswrapper[4698]: W0224 10:16:55.042376 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:55 crc kubenswrapper[4698]: E0224 10:16:55.042441 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Feb 24 10:16:55 crc kubenswrapper[4698]: I0224 10:16:55.559990 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:55 crc kubenswrapper[4698]: E0224 10:16:55.714398 4698 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 10:16:56 crc kubenswrapper[4698]: I0224 10:16:56.561121 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:56 crc kubenswrapper[4698]: I0224 10:16:56.683682 4698 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 10:16:56 crc kubenswrapper[4698]: I0224 10:16:56.683998 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 10:16:56 crc kubenswrapper[4698]: E0224 10:16:56.691404 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897274e8ae0936f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 24 10:16:56 crc kubenswrapper[4698]: 	&Event{ObjectMeta:{kube-controller-manager-crc.1897274e8ae0936f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Feb 24 10:16:56 crc kubenswrapper[4698]: 	body:
Feb 24 10:16:56 crc kubenswrapper[4698]: 	,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:36.684485487 +0000 UTC m=+21.798099768,LastTimestamp:2026-02-24 10:16:56.683958527 +0000 UTC m=+41.797572788,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 24 10:16:56 crc kubenswrapper[4698]: 	>
Feb 24 10:16:56 crc kubenswrapper[4698]: E0224 10:16:56.698684 4698 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897274e8ae1ea74\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897274e8ae1ea74 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:16:36.6845733 +0000 UTC m=+21.798187571,LastTimestamp:2026-02-24 10:16:56.684040479 +0000 UTC m=+41.797654720,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 24 10:16:56 crc kubenswrapper[4698]: W0224 10:16:56.827421 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Feb 24 10:16:56 crc kubenswrapper[4698]: E0224 10:16:56.827467 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Feb 24 10:16:57 crc kubenswrapper[4698]: I0224 10:16:57.557850 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:58 crc kubenswrapper[4698]: I0224 10:16:58.488702 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:16:58 crc kubenswrapper[4698]: I0224 10:16:58.489968 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:16:58 crc kubenswrapper[4698]: I0224 10:16:58.490005 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:16:58 crc kubenswrapper[4698]: I0224 10:16:58.490016 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:16:58 crc kubenswrapper[4698]: I0224 10:16:58.490041 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:16:58 crc kubenswrapper[4698]: E0224 10:16:58.496303 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 24 10:16:58 crc kubenswrapper[4698]: E0224 10:16:58.496426 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 24 10:16:58 crc kubenswrapper[4698]: I0224 10:16:58.553820 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:16:59 crc kubenswrapper[4698]: I0224 10:16:59.556195 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:17:00 crc kubenswrapper[4698]: I0224 10:17:00.557936 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:17:00 crc kubenswrapper[4698]: W0224 10:17:00.908648 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Feb 24 10:17:00 crc kubenswrapper[4698]: E0224 10:17:00.908733 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Feb 24 10:17:01 crc kubenswrapper[4698]: I0224 10:17:01.559401 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:17:02 crc kubenswrapper[4698]: I0224 10:17:02.559078 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:17:03 crc kubenswrapper[4698]: I0224 10:17:03.559843 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:17:03 crc kubenswrapper[4698]: I0224 10:17:03.691162 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:17:03 crc kubenswrapper[4698]: I0224 10:17:03.691380 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:17:03 crc kubenswrapper[4698]: I0224 10:17:03.693405 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:03 crc kubenswrapper[4698]: I0224 10:17:03.693444 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:03 crc kubenswrapper[4698]: I0224 10:17:03.693456 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:03 crc kubenswrapper[4698]: I0224 10:17:03.696755 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:17:03 crc kubenswrapper[4698]: I0224 10:17:03.850198 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:17:03 crc kubenswrapper[4698]: I0224 10:17:03.851322 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:03 crc kubenswrapper[4698]: I0224 10:17:03.851360 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:03 crc kubenswrapper[4698]: I0224 10:17:03.851371 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:04 crc kubenswrapper[4698]: I0224 10:17:04.560764 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:17:05 crc kubenswrapper[4698]: I0224 10:17:05.496594 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:17:05 crc kubenswrapper[4698]: I0224 10:17:05.498533 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:05 crc kubenswrapper[4698]: I0224 10:17:05.498641 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:05 crc kubenswrapper[4698]: I0224 10:17:05.498670 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:05 crc kubenswrapper[4698]: I0224 10:17:05.498718 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Feb 24 10:17:05 crc kubenswrapper[4698]: E0224 10:17:05.503227 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Feb 24 10:17:05 crc kubenswrapper[4698]: E0224 10:17:05.503248 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Feb 24 10:17:05 crc kubenswrapper[4698]: I0224 10:17:05.560053 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24 10:17:05 crc kubenswrapper[4698]: E0224 10:17:05.714622 4698 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 10:17:06 crc kubenswrapper[4698]: I0224 10:17:06.560412 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Feb 24
10:17:06 crc kubenswrapper[4698]: I0224 10:17:06.990597 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 24 10:17:06 crc kubenswrapper[4698]: I0224 10:17:06.990812 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:17:06 crc kubenswrapper[4698]: I0224 10:17:06.992211 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:06 crc kubenswrapper[4698]: I0224 10:17:06.992297 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:06 crc kubenswrapper[4698]: I0224 10:17:06.992325 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:07 crc kubenswrapper[4698]: I0224 10:17:07.560064 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:07 crc kubenswrapper[4698]: I0224 10:17:07.614144 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:17:07 crc kubenswrapper[4698]: I0224 10:17:07.615936 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:07 crc kubenswrapper[4698]: I0224 10:17:07.616000 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:07 crc kubenswrapper[4698]: I0224 10:17:07.616021 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:07 crc kubenswrapper[4698]: I0224 10:17:07.616878 4698 scope.go:117] "RemoveContainer" 
containerID="711fa37ba0589a56b7cfdbca923b12c18a2088ff53b10b6395e1b4f866c198b1" Feb 24 10:17:07 crc kubenswrapper[4698]: I0224 10:17:07.867717 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 24 10:17:07 crc kubenswrapper[4698]: I0224 10:17:07.870515 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2"} Feb 24 10:17:08 crc kubenswrapper[4698]: I0224 10:17:08.560026 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:08 crc kubenswrapper[4698]: I0224 10:17:08.875607 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 10:17:08 crc kubenswrapper[4698]: I0224 10:17:08.876470 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 24 10:17:08 crc kubenswrapper[4698]: I0224 10:17:08.878703 4698 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2" exitCode=255 Feb 24 10:17:08 crc kubenswrapper[4698]: I0224 10:17:08.878754 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2"} Feb 24 
10:17:08 crc kubenswrapper[4698]: I0224 10:17:08.878800 4698 scope.go:117] "RemoveContainer" containerID="711fa37ba0589a56b7cfdbca923b12c18a2088ff53b10b6395e1b4f866c198b1" Feb 24 10:17:08 crc kubenswrapper[4698]: I0224 10:17:08.878839 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:17:08 crc kubenswrapper[4698]: I0224 10:17:08.879970 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:08 crc kubenswrapper[4698]: I0224 10:17:08.880032 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:08 crc kubenswrapper[4698]: I0224 10:17:08.880051 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:08 crc kubenswrapper[4698]: I0224 10:17:08.881050 4698 scope.go:117] "RemoveContainer" containerID="64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2" Feb 24 10:17:08 crc kubenswrapper[4698]: E0224 10:17:08.881569 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 10:17:09 crc kubenswrapper[4698]: I0224 10:17:09.560219 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:09 crc kubenswrapper[4698]: I0224 10:17:09.884618 4698 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 10:17:09 crc kubenswrapper[4698]: I0224 10:17:09.887359 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:17:09 crc kubenswrapper[4698]: I0224 10:17:09.888620 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:09 crc kubenswrapper[4698]: I0224 10:17:09.888675 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:09 crc kubenswrapper[4698]: I0224 10:17:09.888693 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:09 crc kubenswrapper[4698]: I0224 10:17:09.889621 4698 scope.go:117] "RemoveContainer" containerID="64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2" Feb 24 10:17:09 crc kubenswrapper[4698]: E0224 10:17:09.889929 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 10:17:10 crc kubenswrapper[4698]: I0224 10:17:10.565046 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:11 crc kubenswrapper[4698]: I0224 10:17:11.563175 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:12 crc kubenswrapper[4698]: W0224 10:17:12.329412 4698 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Feb 24 10:17:12 crc kubenswrapper[4698]: E0224 10:17:12.329475 4698 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.338957 4698 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.339154 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.340724 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.340798 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.340823 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.341814 4698 scope.go:117] "RemoveContainer" containerID="64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2" Feb 24 10:17:12 crc kubenswrapper[4698]: E0224 10:17:12.342112 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.504362 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.506186 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.506257 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.506317 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.506362 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 10:17:12 crc kubenswrapper[4698]: E0224 10:17:12.512124 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 24 10:17:12 crc kubenswrapper[4698]: E0224 10:17:12.512495 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 24 10:17:12 crc kubenswrapper[4698]: I0224 10:17:12.559680 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:13 crc kubenswrapper[4698]: I0224 
10:17:13.561083 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:14 crc kubenswrapper[4698]: I0224 10:17:14.561332 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:15 crc kubenswrapper[4698]: I0224 10:17:15.560131 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:15 crc kubenswrapper[4698]: E0224 10:17:15.715005 4698 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 10:17:16 crc kubenswrapper[4698]: I0224 10:17:16.561109 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:17 crc kubenswrapper[4698]: I0224 10:17:17.559521 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:18 crc kubenswrapper[4698]: I0224 10:17:18.557817 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:18 crc 
kubenswrapper[4698]: I0224 10:17:18.573423 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:17:18 crc kubenswrapper[4698]: I0224 10:17:18.573876 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:17:18 crc kubenswrapper[4698]: I0224 10:17:18.575672 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:18 crc kubenswrapper[4698]: I0224 10:17:18.575769 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:18 crc kubenswrapper[4698]: I0224 10:17:18.575834 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:18 crc kubenswrapper[4698]: I0224 10:17:18.577190 4698 scope.go:117] "RemoveContainer" containerID="64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2" Feb 24 10:17:18 crc kubenswrapper[4698]: E0224 10:17:18.577530 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 10:17:19 crc kubenswrapper[4698]: I0224 10:17:19.513326 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:17:19 crc kubenswrapper[4698]: I0224 10:17:19.515303 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:19 crc kubenswrapper[4698]: I0224 10:17:19.515452 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Feb 24 10:17:19 crc kubenswrapper[4698]: I0224 10:17:19.515536 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:19 crc kubenswrapper[4698]: I0224 10:17:19.515655 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 10:17:19 crc kubenswrapper[4698]: E0224 10:17:19.520394 4698 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 24 10:17:19 crc kubenswrapper[4698]: E0224 10:17:19.520706 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 24 10:17:19 crc kubenswrapper[4698]: I0224 10:17:19.554436 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:20 crc kubenswrapper[4698]: I0224 10:17:20.559950 4698 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 24 10:17:21 crc kubenswrapper[4698]: I0224 10:17:21.429924 4698 csr.go:261] certificate signing request csr-hnq2j is approved, waiting to be issued Feb 24 10:17:21 crc kubenswrapper[4698]: I0224 10:17:21.439876 4698 csr.go:257] certificate signing request csr-hnq2j is issued Feb 24 10:17:21 crc kubenswrapper[4698]: I0224 10:17:21.529247 4698 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 24 10:17:22 crc kubenswrapper[4698]: I0224 10:17:22.404026 4698 
transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 24 10:17:22 crc kubenswrapper[4698]: I0224 10:17:22.442441 4698 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-30 16:33:05.950001491 +0000 UTC Feb 24 10:17:22 crc kubenswrapper[4698]: I0224 10:17:22.442503 4698 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6702h15m43.507504505s for next certificate rotation Feb 24 10:17:25 crc kubenswrapper[4698]: E0224 10:17:25.715333 4698 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.520637 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.521668 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.521718 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.521735 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.521865 4698 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.531492 4698 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.531576 4698 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 24 10:17:26 crc kubenswrapper[4698]: E0224 10:17:26.531593 4698 kubelet_node_status.go:585] "Error updating node status, will 
retry" err="error getting node \"crc\": node \"crc\" not found" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.536901 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.536977 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.537004 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.537037 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.537062 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:26Z","lastTransitionTime":"2026-02-24T10:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:26 crc kubenswrapper[4698]: E0224 10:17:26.553131 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.561823 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.561899 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.561920 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.561944 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.561962 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:26Z","lastTransitionTime":"2026-02-24T10:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:26 crc kubenswrapper[4698]: E0224 10:17:26.571225 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.577231 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.577278 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.577289 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.577301 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.577311 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:26Z","lastTransitionTime":"2026-02-24T10:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:26 crc kubenswrapper[4698]: E0224 10:17:26.590079 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.598060 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.598100 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.598111 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.598125 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:26 crc kubenswrapper[4698]: I0224 10:17:26.598136 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:26Z","lastTransitionTime":"2026-02-24T10:17:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:26 crc kubenswrapper[4698]: E0224 10:17:26.611593 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:26 crc kubenswrapper[4698]: E0224 10:17:26.612020 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:17:26 crc kubenswrapper[4698]: E0224 10:17:26.612051 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:26 crc kubenswrapper[4698]: E0224 10:17:26.713144 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:26 crc kubenswrapper[4698]: E0224 10:17:26.813429 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:26 crc kubenswrapper[4698]: E0224 10:17:26.914642 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:27 crc kubenswrapper[4698]: E0224 10:17:27.015742 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:27 crc kubenswrapper[4698]: E0224 10:17:27.115867 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:27 crc kubenswrapper[4698]: E0224 10:17:27.217074 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:27 crc kubenswrapper[4698]: E0224 10:17:27.317199 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:27 crc kubenswrapper[4698]: E0224 10:17:27.418544 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:27 crc kubenswrapper[4698]: E0224 10:17:27.518840 4698 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:27 crc kubenswrapper[4698]: E0224 10:17:27.619779 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:27 crc kubenswrapper[4698]: E0224 10:17:27.720661 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:27 crc kubenswrapper[4698]: E0224 10:17:27.821813 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:27 crc kubenswrapper[4698]: E0224 10:17:27.922410 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:28 crc kubenswrapper[4698]: E0224 10:17:28.023389 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:28 crc kubenswrapper[4698]: E0224 10:17:28.124131 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:28 crc kubenswrapper[4698]: E0224 10:17:28.224326 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:28 crc kubenswrapper[4698]: E0224 10:17:28.325813 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:28 crc kubenswrapper[4698]: E0224 10:17:28.426798 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:28 crc kubenswrapper[4698]: E0224 10:17:28.527595 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:28 crc kubenswrapper[4698]: E0224 10:17:28.628343 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:28 crc 
kubenswrapper[4698]: E0224 10:17:28.729369 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:28 crc kubenswrapper[4698]: E0224 10:17:28.829713 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:28 crc kubenswrapper[4698]: E0224 10:17:28.930815 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:29 crc kubenswrapper[4698]: E0224 10:17:29.031303 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:29 crc kubenswrapper[4698]: E0224 10:17:29.131701 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:29 crc kubenswrapper[4698]: E0224 10:17:29.232212 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:29 crc kubenswrapper[4698]: E0224 10:17:29.332467 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:29 crc kubenswrapper[4698]: E0224 10:17:29.433187 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:29 crc kubenswrapper[4698]: E0224 10:17:29.534307 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:29 crc kubenswrapper[4698]: E0224 10:17:29.645971 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:29 crc kubenswrapper[4698]: E0224 10:17:29.746132 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:29 crc kubenswrapper[4698]: E0224 10:17:29.847289 4698 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 24 10:17:29 crc kubenswrapper[4698]: E0224 10:17:29.947712 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:30 crc kubenswrapper[4698]: E0224 10:17:30.047920 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:30 crc kubenswrapper[4698]: E0224 10:17:30.149010 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:30 crc kubenswrapper[4698]: E0224 10:17:30.249871 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:30 crc kubenswrapper[4698]: E0224 10:17:30.350310 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:30 crc kubenswrapper[4698]: E0224 10:17:30.450825 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:30 crc kubenswrapper[4698]: E0224 10:17:30.551288 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:30 crc kubenswrapper[4698]: E0224 10:17:30.651440 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:30 crc kubenswrapper[4698]: E0224 10:17:30.751742 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:30 crc kubenswrapper[4698]: E0224 10:17:30.851970 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:30 crc kubenswrapper[4698]: E0224 10:17:30.952994 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:31 crc kubenswrapper[4698]: E0224 10:17:31.053807 4698 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:31 crc kubenswrapper[4698]: E0224 10:17:31.154625 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:31 crc kubenswrapper[4698]: E0224 10:17:31.255722 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:31 crc kubenswrapper[4698]: E0224 10:17:31.356815 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:31 crc kubenswrapper[4698]: E0224 10:17:31.457491 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:31 crc kubenswrapper[4698]: E0224 10:17:31.558160 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:31 crc kubenswrapper[4698]: E0224 10:17:31.659349 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:31 crc kubenswrapper[4698]: E0224 10:17:31.759718 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:31 crc kubenswrapper[4698]: I0224 10:17:31.821443 4698 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 24 10:17:31 crc kubenswrapper[4698]: E0224 10:17:31.860623 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:31 crc kubenswrapper[4698]: E0224 10:17:31.962427 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:32 crc kubenswrapper[4698]: E0224 10:17:32.063222 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:32 crc kubenswrapper[4698]: E0224 
10:17:32.164573 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:32 crc kubenswrapper[4698]: E0224 10:17:32.265151 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:32 crc kubenswrapper[4698]: E0224 10:17:32.365591 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:32 crc kubenswrapper[4698]: E0224 10:17:32.466802 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:32 crc kubenswrapper[4698]: E0224 10:17:32.568421 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:32 crc kubenswrapper[4698]: I0224 10:17:32.614305 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 24 10:17:32 crc kubenswrapper[4698]: I0224 10:17:32.615901 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:32 crc kubenswrapper[4698]: I0224 10:17:32.616006 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:32 crc kubenswrapper[4698]: I0224 10:17:32.616033 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:32 crc kubenswrapper[4698]: I0224 10:17:32.617023 4698 scope.go:117] "RemoveContainer" containerID="64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2" Feb 24 10:17:32 crc kubenswrapper[4698]: E0224 10:17:32.617394 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 24 10:17:32 crc kubenswrapper[4698]: E0224 10:17:32.669015 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:32 crc kubenswrapper[4698]: E0224 10:17:32.770066 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:32 crc kubenswrapper[4698]: E0224 10:17:32.871084 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:32 crc kubenswrapper[4698]: E0224 10:17:32.971303 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:33 crc kubenswrapper[4698]: E0224 10:17:33.071780 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:33 crc kubenswrapper[4698]: E0224 10:17:33.172844 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:33 crc kubenswrapper[4698]: E0224 10:17:33.273774 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:33 crc kubenswrapper[4698]: E0224 10:17:33.373935 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:33 crc kubenswrapper[4698]: E0224 10:17:33.474440 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:33 crc kubenswrapper[4698]: E0224 10:17:33.574617 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:33 crc kubenswrapper[4698]: E0224 10:17:33.675327 4698 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Feb 24 10:17:33 crc kubenswrapper[4698]: E0224 10:17:33.776234 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:33 crc kubenswrapper[4698]: E0224 10:17:33.877140 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:33 crc kubenswrapper[4698]: E0224 10:17:33.977810 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:34 crc kubenswrapper[4698]: E0224 10:17:34.079034 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:34 crc kubenswrapper[4698]: E0224 10:17:34.179611 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:34 crc kubenswrapper[4698]: E0224 10:17:34.280141 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:34 crc kubenswrapper[4698]: E0224 10:17:34.380863 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:34 crc kubenswrapper[4698]: E0224 10:17:34.481401 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:34 crc kubenswrapper[4698]: E0224 10:17:34.582391 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:34 crc kubenswrapper[4698]: E0224 10:17:34.682906 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:34 crc kubenswrapper[4698]: E0224 10:17:34.783796 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:34 crc kubenswrapper[4698]: I0224 10:17:34.826056 4698 
reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 24 10:17:34 crc kubenswrapper[4698]: E0224 10:17:34.884671 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:34 crc kubenswrapper[4698]: E0224 10:17:34.985634 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:35 crc kubenswrapper[4698]: E0224 10:17:35.086068 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:35 crc kubenswrapper[4698]: E0224 10:17:35.186745 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:35 crc kubenswrapper[4698]: E0224 10:17:35.287232 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:35 crc kubenswrapper[4698]: E0224 10:17:35.388003 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:35 crc kubenswrapper[4698]: E0224 10:17:35.488943 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:35 crc kubenswrapper[4698]: E0224 10:17:35.589839 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:35 crc kubenswrapper[4698]: E0224 10:17:35.690571 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:35 crc kubenswrapper[4698]: E0224 10:17:35.716130 4698 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 24 10:17:35 crc kubenswrapper[4698]: E0224 10:17:35.791449 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Feb 24 10:17:35 crc kubenswrapper[4698]: E0224 10:17:35.892375 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:35 crc kubenswrapper[4698]: E0224 10:17:35.992761 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.092917 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.193529 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.294606 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.395560 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.495721 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.596742 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.686414 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.693227 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.693341 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.693368 4698 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.693402 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.693425 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:36Z","lastTransitionTime":"2026-02-24T10:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.710732 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.717697 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.717781 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.717805 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.717839 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.717870 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:36Z","lastTransitionTime":"2026-02-24T10:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.735248 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.740361 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.740433 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.740458 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.740490 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.740514 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:36Z","lastTransitionTime":"2026-02-24T10:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.756343 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.761554 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.761605 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.761619 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.761641 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:36 crc kubenswrapper[4698]: I0224 10:17:36.761658 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:36Z","lastTransitionTime":"2026-02-24T10:17:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.777716 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.778037 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.778074 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.878240 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:36 crc kubenswrapper[4698]: E0224 10:17:36.978883 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:37 crc kubenswrapper[4698]: E0224 10:17:37.079891 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:37 crc kubenswrapper[4698]: E0224 10:17:37.181009 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:37 crc kubenswrapper[4698]: E0224 10:17:37.281308 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:37 crc kubenswrapper[4698]: E0224 10:17:37.382490 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:37 crc kubenswrapper[4698]: E0224 10:17:37.483634 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:37 crc kubenswrapper[4698]: E0224 10:17:37.584525 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:37 crc kubenswrapper[4698]: E0224 10:17:37.684647 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:37 crc kubenswrapper[4698]: E0224 10:17:37.785150 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:37 crc kubenswrapper[4698]: E0224 10:17:37.886248 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:37 crc kubenswrapper[4698]: E0224 10:17:37.986460 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:38 crc kubenswrapper[4698]: E0224 10:17:38.087328 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:38 crc kubenswrapper[4698]: E0224 10:17:38.187537 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:38 crc kubenswrapper[4698]: E0224 10:17:38.288666 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:38 crc kubenswrapper[4698]: E0224 10:17:38.388892 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:38 crc kubenswrapper[4698]: E0224 10:17:38.489233 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:38 crc kubenswrapper[4698]: E0224 10:17:38.589794 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:38 crc kubenswrapper[4698]: E0224 10:17:38.690515 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:38 crc kubenswrapper[4698]: E0224 10:17:38.791322 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:38 crc kubenswrapper[4698]: E0224 10:17:38.892331 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:38 crc kubenswrapper[4698]: E0224 10:17:38.993445 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:39 crc kubenswrapper[4698]: E0224 10:17:39.093891 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:39 crc kubenswrapper[4698]: E0224 10:17:39.194357 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:39 crc kubenswrapper[4698]: E0224 10:17:39.295089 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:39 crc kubenswrapper[4698]: E0224 10:17:39.395546 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:39 crc kubenswrapper[4698]: E0224 10:17:39.496363 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:39 crc kubenswrapper[4698]: E0224 10:17:39.597452 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:39 crc kubenswrapper[4698]: E0224 10:17:39.698337 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:39 crc kubenswrapper[4698]: E0224 10:17:39.798641 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:39 crc kubenswrapper[4698]: E0224 10:17:39.899738 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:40 crc kubenswrapper[4698]: E0224 10:17:40.000515 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:40 crc kubenswrapper[4698]: E0224 10:17:40.100999 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:40 crc kubenswrapper[4698]: E0224 10:17:40.201963 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:40 crc kubenswrapper[4698]: E0224 10:17:40.302908 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:40 crc kubenswrapper[4698]: E0224 10:17:40.403346 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:40 crc kubenswrapper[4698]: E0224 10:17:40.503752 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:40 crc kubenswrapper[4698]: E0224 10:17:40.604825 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:40 crc kubenswrapper[4698]: E0224 10:17:40.705947 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:40 crc kubenswrapper[4698]: E0224 10:17:40.806354 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:40 crc kubenswrapper[4698]: E0224 10:17:40.906931 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:41 crc kubenswrapper[4698]: E0224 10:17:41.007948 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:41 crc kubenswrapper[4698]: E0224 10:17:41.108401 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:41 crc kubenswrapper[4698]: E0224 10:17:41.209070 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:41 crc kubenswrapper[4698]: E0224 10:17:41.310203 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:41 crc kubenswrapper[4698]: E0224 10:17:41.411234 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:41 crc kubenswrapper[4698]: E0224 10:17:41.511815 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:41 crc kubenswrapper[4698]: E0224 10:17:41.612033 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:41 crc kubenswrapper[4698]: E0224 10:17:41.712598 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:41 crc kubenswrapper[4698]: E0224 10:17:41.813366 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:41 crc kubenswrapper[4698]: E0224 10:17:41.914480 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:42 crc kubenswrapper[4698]: E0224 10:17:42.014666 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:42 crc kubenswrapper[4698]: E0224 10:17:42.114857 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:42 crc kubenswrapper[4698]: E0224 10:17:42.216030 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:42 crc kubenswrapper[4698]: E0224 10:17:42.316797 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:42 crc kubenswrapper[4698]: E0224 10:17:42.417433 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:42 crc kubenswrapper[4698]: E0224 10:17:42.518290 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:42 crc kubenswrapper[4698]: E0224 10:17:42.619211 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:42 crc kubenswrapper[4698]: E0224 10:17:42.719767 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:42 crc kubenswrapper[4698]: E0224 10:17:42.820461 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:42 crc kubenswrapper[4698]: E0224 10:17:42.920945 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:43 crc kubenswrapper[4698]: E0224 10:17:43.021513 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:43 crc kubenswrapper[4698]: E0224 10:17:43.121999 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:43 crc kubenswrapper[4698]: E0224 10:17:43.223070 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:43 crc kubenswrapper[4698]: E0224 10:17:43.324057 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:43 crc kubenswrapper[4698]: E0224 10:17:43.425219 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:43 crc kubenswrapper[4698]: E0224 10:17:43.526197 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:43 crc kubenswrapper[4698]: I0224 10:17:43.614135 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:17:43 crc kubenswrapper[4698]: I0224 10:17:43.616062 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:43 crc kubenswrapper[4698]: I0224 10:17:43.616141 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:43 crc kubenswrapper[4698]: I0224 10:17:43.616161 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:43 crc kubenswrapper[4698]: E0224 10:17:43.626859 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:43 crc kubenswrapper[4698]: E0224 10:17:43.727356 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:43 crc kubenswrapper[4698]: E0224 10:17:43.827528 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:43 crc kubenswrapper[4698]: E0224 10:17:43.927745 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:44 crc kubenswrapper[4698]: E0224 10:17:44.028348 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:44 crc kubenswrapper[4698]: E0224 10:17:44.128930 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:44 crc kubenswrapper[4698]: E0224 10:17:44.229604 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:44 crc kubenswrapper[4698]: E0224 10:17:44.330250 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:44 crc kubenswrapper[4698]: E0224 10:17:44.431227 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:44 crc kubenswrapper[4698]: E0224 10:17:44.532445 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:44 crc kubenswrapper[4698]: I0224 10:17:44.613927 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:17:44 crc kubenswrapper[4698]: I0224 10:17:44.615972 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:44 crc kubenswrapper[4698]: I0224 10:17:44.616052 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:44 crc kubenswrapper[4698]: I0224 10:17:44.616075 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:44 crc kubenswrapper[4698]: E0224 10:17:44.633605 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:44 crc kubenswrapper[4698]: E0224 10:17:44.734375 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:44 crc kubenswrapper[4698]: E0224 10:17:44.834970 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:44 crc kubenswrapper[4698]: E0224 10:17:44.936161 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.036519 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.137368 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.238029 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.338879 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.439537 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.540064 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:45 crc kubenswrapper[4698]: I0224 10:17:45.614492 4698 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 24 10:17:45 crc kubenswrapper[4698]: I0224 10:17:45.616510 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:45 crc kubenswrapper[4698]: I0224 10:17:45.616580 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:45 crc kubenswrapper[4698]: I0224 10:17:45.616608 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:45 crc kubenswrapper[4698]: I0224 10:17:45.617884 4698 scope.go:117] "RemoveContainer" containerID="64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.618410 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.640631 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.716849 4698 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.741818 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.842688 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:45 crc kubenswrapper[4698]: E0224 10:17:45.943033 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:46 crc kubenswrapper[4698]: E0224 10:17:46.043379 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:46 crc kubenswrapper[4698]: E0224 10:17:46.143746 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:46 crc kubenswrapper[4698]: E0224 10:17:46.244789 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:46 crc kubenswrapper[4698]: E0224 10:17:46.345938 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:46 crc kubenswrapper[4698]: E0224 10:17:46.447069 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:46 crc kubenswrapper[4698]: E0224 10:17:46.547962 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:46 crc kubenswrapper[4698]: E0224 10:17:46.648458 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:46 crc kubenswrapper[4698]: E0224 10:17:46.749002 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:46 crc kubenswrapper[4698]: E0224 10:17:46.849827 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:46 crc kubenswrapper[4698]: E0224 10:17:46.950593 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.050962 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.113017 4698 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.121757 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.128527 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.128615 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.128641 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.128673 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:47 crc 
kubenswrapper[4698]: I0224 10:17:47.128695 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:47Z","lastTransitionTime":"2026-02-24T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.147972 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.160624 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.160794 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.160824 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.160899 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.161002 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:47Z","lastTransitionTime":"2026-02-24T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.178300 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.206384 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.206453 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.206471 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.206499 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.206515 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:47Z","lastTransitionTime":"2026-02-24T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.224866 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.230148 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.230214 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.230227 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.230253 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:47 crc kubenswrapper[4698]: I0224 10:17:47.230299 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:47Z","lastTransitionTime":"2026-02-24T10:17:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.246442 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.246799 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.246861 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.347431 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.447830 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.548566 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.649612 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.750679 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.851893 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:47 crc kubenswrapper[4698]: E0224 10:17:47.952402 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:48 crc kubenswrapper[4698]: E0224 10:17:48.052972 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:48 crc kubenswrapper[4698]: E0224 10:17:48.153395 4698 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:48 crc kubenswrapper[4698]: E0224 10:17:48.253842 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:48 crc kubenswrapper[4698]: E0224 10:17:48.354187 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:48 crc kubenswrapper[4698]: E0224 10:17:48.454336 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:48 crc kubenswrapper[4698]: E0224 10:17:48.555748 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:48 crc kubenswrapper[4698]: E0224 10:17:48.655902 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:48 crc kubenswrapper[4698]: E0224 10:17:48.756072 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:48 crc kubenswrapper[4698]: E0224 10:17:48.856201 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:48 crc kubenswrapper[4698]: E0224 10:17:48.956325 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:49 crc kubenswrapper[4698]: E0224 10:17:49.056805 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:49 crc kubenswrapper[4698]: E0224 10:17:49.156996 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:49 crc kubenswrapper[4698]: E0224 10:17:49.257743 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:49 crc 
kubenswrapper[4698]: E0224 10:17:49.358251 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:49 crc kubenswrapper[4698]: E0224 10:17:49.458923 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:49 crc kubenswrapper[4698]: E0224 10:17:49.559713 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:49 crc kubenswrapper[4698]: E0224 10:17:49.660398 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:49 crc kubenswrapper[4698]: E0224 10:17:49.760729 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:49 crc kubenswrapper[4698]: E0224 10:17:49.861609 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:49 crc kubenswrapper[4698]: E0224 10:17:49.962618 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:50 crc kubenswrapper[4698]: E0224 10:17:50.063710 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:50 crc kubenswrapper[4698]: E0224 10:17:50.164661 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:50 crc kubenswrapper[4698]: E0224 10:17:50.264956 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:50 crc kubenswrapper[4698]: E0224 10:17:50.365092 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 24 10:17:50 crc kubenswrapper[4698]: E0224 10:17:50.465885 4698 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found"
Feb 24 10:17:50 crc kubenswrapper[4698]: E0224 10:17:50.566498 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:50 crc kubenswrapper[4698]: E0224 10:17:50.666893 4698 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.667357 4698 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.769198 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.769322 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.769348 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.769382 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.769401 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:50Z","lastTransitionTime":"2026-02-24T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.871763 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.871817 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.871834 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.871855 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.871871 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:50Z","lastTransitionTime":"2026-02-24T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.974782 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.974884 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.974982 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.975016 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:50 crc kubenswrapper[4698]: I0224 10:17:50.975039 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:50Z","lastTransitionTime":"2026-02-24T10:17:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.077544 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.077591 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.077603 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.077626 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.077638 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:51Z","lastTransitionTime":"2026-02-24T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.181063 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.181136 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.181154 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.181178 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.181197 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:51Z","lastTransitionTime":"2026-02-24T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.285718 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.285794 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.285813 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.285845 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.285866 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:51Z","lastTransitionTime":"2026-02-24T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.389828 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.389906 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.389919 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.389944 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.389960 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:51Z","lastTransitionTime":"2026-02-24T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.493209 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.493302 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.493329 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.493358 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.493380 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:51Z","lastTransitionTime":"2026-02-24T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.590886 4698 apiserver.go:52] "Watching apiserver"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.596090 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.596155 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.596173 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.596204 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.596222 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:51Z","lastTransitionTime":"2026-02-24T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.597517 4698 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.597982 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-29rvz","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.598642 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.598784 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.598877 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.598944 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.599003 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.598998 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.599062 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.600615 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-29rvz"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.600735 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.600812 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.602448 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.602561 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.605826 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.607352 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.607580 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.605973 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.607980 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.610561 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.610622 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.610949 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.611083 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.611443 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.638124 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.656685 4698 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.657995 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.678415 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.693613 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.699130 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.699187 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.699207 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.699230 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.699247 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:51Z","lastTransitionTime":"2026-02-24T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.710625 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.725217 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.735448 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.735517 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.735569 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.735611 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.735658 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.735708 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.735755 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.735797 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.736332 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.736389 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.736590 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.736678 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.736722 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.737083 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.737224 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.737342 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.737414 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.737432 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.737587 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.737780 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.737818 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.737847 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.737975 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.738492 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.738716 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.738955 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.739321 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.739453 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.739450 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.739866 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.739918 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.739974 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.740224 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.740638 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.740688 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.740717 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.740784 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741047 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741202 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741374 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.740812 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741511 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741556 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741601 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741636 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741677 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741713 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741749 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741791 4698 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741826 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741861 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741894 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741927 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.741962 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " 
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742010 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742046 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742083 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742114 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742146 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742179 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" 
(UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742188 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742213 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742168 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742251 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742314 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742312 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742347 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742514 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742575 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742628 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 10:17:51 crc 
kubenswrapper[4698]: I0224 10:17:51.742561 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742677 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742725 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742739 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742770 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742820 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742872 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742928 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742980 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743024 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743070 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743124 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743174 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743224 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743314 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743364 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743548 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743600 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743643 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743687 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743733 4698 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743777 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743826 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743875 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743920 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743969 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744016 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744064 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744116 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744165 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744219 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744304 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744357 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744395 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744466 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744503 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.745718 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.745877 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.745945 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.745980 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746014 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746047 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746080 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746117 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746152 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746189 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746226 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746304 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 10:17:51 crc 
kubenswrapper[4698]: I0224 10:17:51.746351 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746390 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746426 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746459 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746492 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746538 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746580 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746615 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746651 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746684 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746717 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 10:17:51 crc 
kubenswrapper[4698]: I0224 10:17:51.746749 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746783 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746822 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746877 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746933 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746987 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747037 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747094 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747135 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747170 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747207 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747244 4698 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747327 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747367 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747401 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747435 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747473 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747508 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747541 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747576 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747610 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747647 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747680 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747716 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747750 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742804 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.742786 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743082 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743343 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743432 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.743612 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744600 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.744752 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.745385 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.745533 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.745584 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.745846 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746471 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746835 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.746866 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747297 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747461 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747746 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747776 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747917 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.747785 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748231 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748248 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748345 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748414 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748504 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748519 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). 
InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748566 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748617 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748669 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748821 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748886 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748933 4698 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748991 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749041 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749090 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749138 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749172 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 24 
10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749205 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749241 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749311 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749350 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749383 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749420 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749458 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749509 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749543 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749578 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749613 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 10:17:51 crc 
kubenswrapper[4698]: I0224 10:17:51.749648 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749682 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749717 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749749 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749783 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749817 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749855 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749909 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749962 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750012 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750068 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 24 10:17:51 
crc kubenswrapper[4698]: I0224 10:17:51.750119 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750177 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750225 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750318 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750376 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750434 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750491 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750549 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750600 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750650 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750704 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750757 4698 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750809 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750860 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750916 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750983 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751037 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751095 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751150 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751340 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751398 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751439 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751473 4698 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751509 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751549 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751589 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751622 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751675 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod 
\"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751772 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751820 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751870 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751933 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752008 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752069 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752126 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752177 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752230 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752342 4698 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752390 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752465 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752504 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752554 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.748826 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752610 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9cba56db-d46e-4a34-9863-47e4dce27ca5-hosts-file\") pod \"node-resolver-29rvz\" (UID: \"9cba56db-d46e-4a34-9863-47e4dce27ca5\") " pod="openshift-dns/node-resolver-29rvz" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752679 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk9xv\" (UniqueName: \"kubernetes.io/projected/9cba56db-d46e-4a34-9863-47e4dce27ca5-kube-api-access-fk9xv\") pod \"node-resolver-29rvz\" (UID: \"9cba56db-d46e-4a34-9863-47e4dce27ca5\") " pod="openshift-dns/node-resolver-29rvz" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752817 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752853 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752885 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" 
(UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752913 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752933 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752955 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752975 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752995 4698 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753024 4698 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753053 4698 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753082 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753115 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753144 4698 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753172 4698 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753200 4698 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753228 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753253 4698 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753325 4698 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753356 4698 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753391 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753420 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753448 4698 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753476 4698 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753506 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753535 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753561 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753587 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753615 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753646 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753677 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753706 4698 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 
24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753734 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753762 4698 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753793 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753823 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753856 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753888 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753919 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: 
I0224 10:17:51.753950 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753980 4698 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.754009 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.754038 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.754066 4698 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.754097 4698 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.754125 4698 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.754153 4698 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.754181 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.762513 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.764863 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.770254 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.771977 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" 
(UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.749563 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750327 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750307 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750236 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.750664 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751052 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751459 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751913 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.751808 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752066 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752332 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752638 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.752757 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.771134 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.772939 4698 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753315 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.753821 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). 
InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.773903 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.754099 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.754231 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.755422 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.755629 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.755806 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.756359 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.756896 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.757098 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.757457 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.758196 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.758324 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.758369 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.758399 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:17:52.258365034 +0000 UTC m=+97.371979305 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.758469 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.759011 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.760017 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.760446 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.761158 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.761882 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.761887 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.761909 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.761951 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.762590 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.762746 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.762545 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.763546 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.763772 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.763799 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.763860 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.764290 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.764299 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.764245 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.764549 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.764557 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.764731 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.765047 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.765072 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.765286 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.765532 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.765609 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.765663 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.765982 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.768730 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.768872 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.769026 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.769136 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.769181 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.769199 4698 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.769395 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.769585 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.769812 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.770025 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.770033 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.770314 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.770391 4698 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.771164 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). 
InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.771335 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.771840 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.772880 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.772916 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.774968 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:52.274938729 +0000 UTC m=+97.388553000 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.775621 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:52.275536813 +0000 UTC m=+97.389151104 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.776382 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.777380 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.777939 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.778752 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.779845 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.780185 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.780564 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.781736 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.781842 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.781962 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.784515 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.784708 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod 
"49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.784784 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.784866 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.784996 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.785073 4698 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.785210 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:52.285189529 +0000 UTC m=+97.398803780 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.787513 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.787954 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.788377 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.790665 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.791654 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.791682 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.792233 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.793359 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.793426 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.793537 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.794191 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.794250 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.794287 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.793591 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.794809 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.794927 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.795097 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.795143 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.795303 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.795317 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.795747 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.795793 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.795796 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.795826 4698 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 10:17:51 crc kubenswrapper[4698]: E0224 10:17:51.795926 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:52.29588941 +0000 UTC m=+97.409503731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.795950 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.795979 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.796040 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.796131 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.796976 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.798244 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.799626 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.801610 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.801959 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.802521 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.802553 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.802568 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.802592 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.802607 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:51Z","lastTransitionTime":"2026-02-24T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.802628 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.806301 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.807735 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.807732 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.807912 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.807914 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.807965 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.808058 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.808078 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.808533 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.808602 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.808834 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.809053 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.809331 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.809393 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.809594 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.810014 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.810088 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.810366 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.810597 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.810680 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.810777 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.810835 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.811478 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.811518 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.815061 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.815323 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.815905 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.816011 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.816312 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.820667 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.821427 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.823719 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.836303 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.850585 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.854121 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.855532 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-nn578"]
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.861727 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.861808 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.861862 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9cba56db-d46e-4a34-9863-47e4dce27ca5-hosts-file\") pod \"node-resolver-29rvz\" (UID: \"9cba56db-d46e-4a34-9863-47e4dce27ca5\") " pod="openshift-dns/node-resolver-29rvz"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.861908 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk9xv\" (UniqueName: \"kubernetes.io/projected/9cba56db-d46e-4a34-9863-47e4dce27ca5-kube-api-access-fk9xv\") pod \"node-resolver-29rvz\" (UID: \"9cba56db-d46e-4a34-9863-47e4dce27ca5\") " pod="openshift-dns/node-resolver-29rvz"
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862109 4698 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862133 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862148 4698 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862162 4698 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862176 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862196 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862209 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862221 4698 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862238 4698 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862250 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862287 4698 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862300 4698 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862316 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862329 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862342 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862356 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862375 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862388 4698 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862401 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862413 4698 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862429 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862442 4698 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862458 4698 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName:
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862524 4698 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862489 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862538 4698 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862609 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9cba56db-d46e-4a34-9863-47e4dce27ca5-hosts-file\") pod \"node-resolver-29rvz\" (UID: \"9cba56db-d46e-4a34-9863-47e4dce27ca5\") " pod="openshift-dns/node-resolver-29rvz" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862650 4698 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862706 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862752 4698 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862780 4698 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862842 4698 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862881 4698 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862910 4698 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862953 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.862982 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc 
kubenswrapper[4698]: I0224 10:17:51.863013 4698 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863053 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863081 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863109 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863137 4698 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863179 4698 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863206 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863232 4698 
reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863292 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863332 4698 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863358 4698 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863386 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863414 4698 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863455 4698 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863487 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863516 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863553 4698 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863580 4698 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863608 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863634 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863671 4698 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863697 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: 
I0224 10:17:51.863723 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863749 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863786 4698 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863816 4698 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863842 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863869 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863908 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863936 4698 reconciler_common.go:293] "Volume detached for volume 
\"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863963 4698 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.863999 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864026 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864053 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864079 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864114 4698 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864142 4698 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864168 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864193 4698 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864228 4698 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864255 4698 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864331 4698 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864384 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864427 4698 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc 
kubenswrapper[4698]: I0224 10:17:51.864452 4698 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864486 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864508 4698 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864529 4698 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864550 4698 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864580 4698 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864600 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864620 4698 reconciler_common.go:293] "Volume detached 
for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864645 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864665 4698 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864685 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864705 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864733 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864752 4698 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864773 4698 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864795 4698 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864822 4698 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864841 4698 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864861 4698 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864881 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864908 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864926 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc 
kubenswrapper[4698]: I0224 10:17:51.864948 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864972 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.864992 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865284 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865314 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865332 4698 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865355 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 
10:17:51.865378 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865397 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865416 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865443 4698 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865456 4698 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865469 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865482 4698 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865645 4698 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865661 4698 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865675 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865742 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865762 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865779 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865798 4698 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865821 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 
24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865835 4698 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865848 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865860 4698 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865877 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865890 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865903 4698 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865916 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 
10:17:51.865934 4698 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865946 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865958 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865976 4698 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.865988 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866000 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866012 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866027 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866041 4698 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866053 4698 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866066 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866087 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866103 4698 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866121 4698 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866139 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 24 
10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866154 4698 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866167 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866179 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.866195 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.871170 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-7mbk6"] Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.871228 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.872368 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jlg97"] Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.872731 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.877675 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.877773 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.877826 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.878006 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.878097 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.878220 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.878233 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.878409 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.879086 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.881764 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.882555 4698 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.884117 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.884692 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.886072 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk9xv\" (UniqueName: \"kubernetes.io/projected/9cba56db-d46e-4a34-9863-47e4dce27ca5-kube-api-access-fk9xv\") pod \"node-resolver-29rvz\" (UID: \"9cba56db-d46e-4a34-9863-47e4dce27ca5\") " pod="openshift-dns/node-resolver-29rvz" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.891734 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.906966 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.907094 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.907135 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.907154 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.907181 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.907200 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:51Z","lastTransitionTime":"2026-02-24T10:17:51Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.918018 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.930148 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.931245 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.941936 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.945206 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.951165 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.963905 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 24 10:17:51 crc kubenswrapper[4698]: W0224 10:17:51.965582 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-67e318bf06c461c9c91ccc62f4dd71a79474a75fb95f86a35ef79765907b1df6 WatchSource:0}: Error finding container 67e318bf06c461c9c91ccc62f4dd71a79474a75fb95f86a35ef79765907b1df6: Status 404 returned error can't find the container with id 67e318bf06c461c9c91ccc62f4dd71a79474a75fb95f86a35ef79765907b1df6 Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.966961 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-cnibin\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.967128 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-socket-dir-parent\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.967334 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-var-lib-kubelet\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.967481 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b4ee0bb1-125d-4852-a54d-7dadf6177545-rootfs\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.967326 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.967633 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-os-release\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.967950 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4ee0bb1-125d-4852-a54d-7dadf6177545-mcd-auth-proxy-config\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.968084 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9ngd\" (UniqueName: \"kubernetes.io/projected/b4ee0bb1-125d-4852-a54d-7dadf6177545-kube-api-access-m9ngd\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.968421 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-system-cni-dir\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.968592 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-cnibin\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.968730 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-var-lib-cni-bin\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.968872 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-daemon-config\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.969034 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-etc-kubernetes\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.969257 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-hostroot\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.969571 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-conf-dir\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.969746 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/90062989-bf1b-4479-89a0-f3bf0d438ac3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.969937 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.970176 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-run-multus-certs\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.970342 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/90062989-bf1b-4479-89a0-f3bf0d438ac3-cni-binary-copy\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.970474 4698 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-os-release\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.970574 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-system-cni-dir\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.970679 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-run-netns\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.971112 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-var-lib-cni-multus\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.971312 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxqkd\" (UniqueName: \"kubernetes.io/projected/90062989-bf1b-4479-89a0-f3bf0d438ac3-kube-api-access-jxqkd\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.971499 
4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-cni-binary-copy\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.971637 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-run-k8s-cni-cncf-io\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.971788 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgnjd\" (UniqueName: \"kubernetes.io/projected/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-kube-api-access-tgnjd\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.971953 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4ee0bb1-125d-4852-a54d-7dadf6177545-proxy-tls\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.972159 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-cni-dir\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.977119 4698 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-29rvz" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.980646 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:51 crc kubenswrapper[4698]: I0224 10:17:51.996441 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: W0224 10:17:52.004837 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cba56db_d46e_4a34_9863_47e4dce27ca5.slice/crio-36dfb0928f1c9baae553b6223d27b31b4206a41bc489a7621a1ed495c81b490f WatchSource:0}: Error finding container 36dfb0928f1c9baae553b6223d27b31b4206a41bc489a7621a1ed495c81b490f: Status 404 returned error can't find the container with id 36dfb0928f1c9baae553b6223d27b31b4206a41bc489a7621a1ed495c81b490f Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.009845 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.009882 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.009901 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 
10:17:52.009926 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.009946 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:52Z","lastTransitionTime":"2026-02-24T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.009959 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"64be639b56eb0cecc5317519a8931e277abb629098886d8a60fb75ef3bba1322"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.011873 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"67e318bf06c461c9c91ccc62f4dd71a79474a75fb95f86a35ef79765907b1df6"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.014433 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5067e256a0553502f6c2190dd2c74ad43ba92f14ace6a93268c876892071fbb1"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.017027 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.035729 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.048932 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.063398 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.072934 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9ngd\" (UniqueName: \"kubernetes.io/projected/b4ee0bb1-125d-4852-a54d-7dadf6177545-kube-api-access-m9ngd\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.072999 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-daemon-config\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073038 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-etc-kubernetes\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073113 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-system-cni-dir\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073156 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-cnibin\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073192 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-var-lib-cni-bin\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073257 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-hostroot\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073316 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-conf-dir\") pod \"multus-7mbk6\" (UID: 
\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073354 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/90062989-bf1b-4479-89a0-f3bf0d438ac3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073391 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073423 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-run-multus-certs\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073456 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/90062989-bf1b-4479-89a0-f3bf0d438ac3-cni-binary-copy\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073492 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-run-netns\") pod \"multus-7mbk6\" (UID: 
\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073526 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-var-lib-cni-multus\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073578 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-os-release\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073608 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-system-cni-dir\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073642 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxqkd\" (UniqueName: \"kubernetes.io/projected/90062989-bf1b-4479-89a0-f3bf0d438ac3-kube-api-access-jxqkd\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073675 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-cni-binary-copy\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " 
pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073706 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4ee0bb1-125d-4852-a54d-7dadf6177545-proxy-tls\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073743 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-cni-dir\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073786 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-run-k8s-cni-cncf-io\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073820 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgnjd\" (UniqueName: \"kubernetes.io/projected/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-kube-api-access-tgnjd\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073854 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-cnibin\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: 
I0224 10:17:52.073911 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-socket-dir-parent\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073943 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-var-lib-kubelet\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073976 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b4ee0bb1-125d-4852-a54d-7dadf6177545-rootfs\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.074009 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4ee0bb1-125d-4852-a54d-7dadf6177545-mcd-auth-proxy-config\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.074041 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-os-release\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.074181 4698 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-os-release\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.074501 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-run-multus-certs\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.074893 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-etc-kubernetes\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.074914 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.074902 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/90062989-bf1b-4479-89a0-f3bf0d438ac3-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.074919 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-var-lib-cni-bin\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.073447 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-system-cni-dir\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.074990 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-cnibin\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075035 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-cni-dir\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.074992 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-hostroot\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075033 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-run-netns\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 
10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075047 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-conf-dir\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075073 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-run-k8s-cni-cncf-io\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075082 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-var-lib-cni-multus\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075116 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-socket-dir-parent\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075124 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-host-var-lib-kubelet\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075138 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-os-release\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075170 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-system-cni-dir\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075174 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/b4ee0bb1-125d-4852-a54d-7dadf6177545-rootfs\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075244 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/90062989-bf1b-4479-89a0-f3bf0d438ac3-cni-binary-copy\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.075427 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/90062989-bf1b-4479-89a0-f3bf0d438ac3-cnibin\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.076244 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-cni-binary-copy\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.077542 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-multus-daemon-config\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.078179 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b4ee0bb1-125d-4852-a54d-7dadf6177545-mcd-auth-proxy-config\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.090065 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.092248 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b4ee0bb1-125d-4852-a54d-7dadf6177545-proxy-tls\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.095855 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgnjd\" (UniqueName: \"kubernetes.io/projected/17dd9ce8-b1ca-4810-85fe-9775919eb4b5-kube-api-access-tgnjd\") pod \"multus-7mbk6\" (UID: \"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\") " pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.098669 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jxqkd\" (UniqueName: \"kubernetes.io/projected/90062989-bf1b-4479-89a0-f3bf0d438ac3-kube-api-access-jxqkd\") pod \"multus-additional-cni-plugins-jlg97\" (UID: \"90062989-bf1b-4479-89a0-f3bf0d438ac3\") " pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.100419 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.101612 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9ngd\" (UniqueName: \"kubernetes.io/projected/b4ee0bb1-125d-4852-a54d-7dadf6177545-kube-api-access-m9ngd\") pod \"machine-config-daemon-nn578\" (UID: \"b4ee0bb1-125d-4852-a54d-7dadf6177545\") " pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.112460 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.113038 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.113081 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.113096 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.113115 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.113126 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:52Z","lastTransitionTime":"2026-02-24T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.123132 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.134658 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.195649 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.244639 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7mbk6" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.244715 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jlg97" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.245334 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mgh7p"] Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.247069 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.252278 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.252356 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.252375 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.252396 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.252417 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:52Z","lastTransitionTime":"2026-02-24T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.252771 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.254148 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.254540 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.254705 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.254819 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.254946 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.254947 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.260907 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.269777 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275366 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275474 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-netns\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275497 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-bin\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275511 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-netd\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275527 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-slash\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275544 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-script-lib\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275562 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-kubelet\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275578 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2k2d\" (UniqueName: \"kubernetes.io/projected/066df704-6981-4770-a647-df52a0da50a0-kube-api-access-l2k2d\") pod \"ovnkube-node-mgh7p\" (UID: 
\"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275597 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-openvswitch\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275611 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-log-socket\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275625 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-ovn\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275644 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275668 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-etc-openvswitch\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275683 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-ovn-kubernetes\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275698 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-config\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275734 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275751 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-node-log\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275768 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275784 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/066df704-6981-4770-a647-df52a0da50a0-ovn-node-metrics-cert\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275798 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-var-lib-openvswitch\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275813 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-systemd-units\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.275837 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-env-overrides\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 
10:17:52.275866 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-systemd\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.275956 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:17:53.275941433 +0000 UTC m=+98.389555674 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.276088 4698 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.276119 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:53.276112977 +0000 UTC m=+98.389727218 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.276171 4698 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.276190 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:53.276184819 +0000 UTC m=+98.389799060 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.278359 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.290777 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.302677 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.315528 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.323823 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.334467 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.346171 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.356834 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.356863 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.356871 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.356888 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.356898 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:52Z","lastTransitionTime":"2026-02-24T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.364360 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.372842 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377389 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-systemd-units\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377428 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-env-overrides\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377449 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-systemd\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377475 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377500 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-netns\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377500 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-systemd-units\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377522 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-bin\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377554 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-netd\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377562 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-netns\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377578 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-slash\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377601 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-script-lib\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377622 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-netd\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377624 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-kubelet\") pod 
\"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377652 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-kubelet\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377653 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-bin\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.377678 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.377705 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377662 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2k2d\" (UniqueName: \"kubernetes.io/projected/066df704-6981-4770-a647-df52a0da50a0-kube-api-access-l2k2d\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.377717 4698 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377751 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-openvswitch\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.377766 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:53.377747409 +0000 UTC m=+98.491361650 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377792 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-openvswitch\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377795 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-log-socket\") pod \"ovnkube-node-mgh7p\" (UID: 
\"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377838 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-ovn\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377871 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377891 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-etc-openvswitch\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377908 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-ovn-kubernetes\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377933 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-node-log\") pod \"ovnkube-node-mgh7p\" (UID: 
\"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377950 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377967 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-config\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377686 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-slash\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377976 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-env-overrides\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.377989 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/066df704-6981-4770-a647-df52a0da50a0-ovn-node-metrics-cert\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.378009 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-var-lib-openvswitch\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.378012 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-log-socket\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.378017 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-node-log\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.378057 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.378099 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.378111 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-ovn-kubernetes\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.378113 4698 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.378098 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-etc-openvswitch\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.378074 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-var-lib-openvswitch\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.378147 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.378213 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-24 10:17:53.37819787 +0000 UTC m=+98.491812121 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.378229 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-ovn\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.378280 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-systemd\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.378490 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-config\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.378590 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-script-lib\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.381215 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/066df704-6981-4770-a647-df52a0da50a0-ovn-node-metrics-cert\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.393362 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2k2d\" (UniqueName: \"kubernetes.io/projected/066df704-6981-4770-a647-df52a0da50a0-kube-api-access-l2k2d\") pod \"ovnkube-node-mgh7p\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.459187 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.459222 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.459232 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.459249 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.459284 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:52Z","lastTransitionTime":"2026-02-24T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.562480 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.562529 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.562546 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.562573 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.562590 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:52Z","lastTransitionTime":"2026-02-24T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.573182 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:17:52 crc kubenswrapper[4698]: W0224 10:17:52.590696 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod066df704_6981_4770_a647_df52a0da50a0.slice/crio-221176da06c75722a417e733f5f7886a0da7218159233b2983b544c5fbd562d4 WatchSource:0}: Error finding container 221176da06c75722a417e733f5f7886a0da7218159233b2983b544c5fbd562d4: Status 404 returned error can't find the container with id 221176da06c75722a417e733f5f7886a0da7218159233b2983b544c5fbd562d4 Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.614468 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:17:52 crc kubenswrapper[4698]: E0224 10:17:52.614591 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.666051 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.666117 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.666133 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.666158 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.666176 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:52Z","lastTransitionTime":"2026-02-24T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.769477 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.769517 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.769529 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.769545 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.769555 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:52Z","lastTransitionTime":"2026-02-24T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.871854 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.871892 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.871903 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.871920 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.871930 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:52Z","lastTransitionTime":"2026-02-24T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.974113 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.974174 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.974192 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.974217 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:52 crc kubenswrapper[4698]: I0224 10:17:52.974234 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:52Z","lastTransitionTime":"2026-02-24T10:17:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.019060 4698 generic.go:334] "Generic (PLEG): container finished" podID="90062989-bf1b-4479-89a0-f3bf0d438ac3" containerID="f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6" exitCode=0 Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.019147 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" event={"ID":"90062989-bf1b-4479-89a0-f3bf0d438ac3","Type":"ContainerDied","Data":"f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.019217 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" event={"ID":"90062989-bf1b-4479-89a0-f3bf0d438ac3","Type":"ContainerStarted","Data":"be9bd4ed3ed5ed8c0c33f2867f4d383993f0bd3e0ede32aa6790bdcd5a8eac01"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.025143 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.025183 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.028311 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd"} Feb 24 10:17:53 crc kubenswrapper[4698]: 
I0224 10:17:53.039531 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.040171 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-29rvz" event={"ID":"9cba56db-d46e-4a34-9863-47e4dce27ca5","Type":"ContainerStarted","Data":"f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.040237 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-29rvz" event={"ID":"9cba56db-d46e-4a34-9863-47e4dce27ca5","Type":"ContainerStarted","Data":"36dfb0928f1c9baae553b6223d27b31b4206a41bc489a7621a1ed495c81b490f"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.044694 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7mbk6" event={"ID":"17dd9ce8-b1ca-4810-85fe-9775919eb4b5","Type":"ContainerStarted","Data":"ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.044756 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-7mbk6" event={"ID":"17dd9ce8-b1ca-4810-85fe-9775919eb4b5","Type":"ContainerStarted","Data":"9971a15139c445f90d7173d7ab08d0f70a11819d9cb3f719c866534d42880042"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.058921 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.059001 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerStarted","Data":"e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.059076 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerStarted","Data":"a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.059103 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerStarted","Data":"3b51eeaff016c75ad96eb7f810dc04567b52f1dd72d925e8e1b75f47800fb528"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.064571 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443" exitCode=0 Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.064606 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.064625 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" 
event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"221176da06c75722a417e733f5f7886a0da7218159233b2983b544c5fbd562d4"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.074842 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.075992 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.076013 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.076022 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.076036 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.076045 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:53Z","lastTransitionTime":"2026-02-24T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.093759 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.117792 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.132321 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.146091 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.162528 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.179062 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.179115 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:53 crc 
kubenswrapper[4698]: I0224 10:17:53.179128 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.179151 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.179168 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:53Z","lastTransitionTime":"2026-02-24T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.184810 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.199249 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.211839 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.237623 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.257695 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.277136 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.282379 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.282424 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.282441 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.282464 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.282481 4698 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:53Z","lastTransitionTime":"2026-02-24T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.285407 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.285556 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.285636 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.285747 4698 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.285817 4698 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:55.285796494 +0000 UTC m=+100.399410775 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.286207 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:17:55.286190104 +0000 UTC m=+100.399804385 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.286342 4698 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.286397 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-24 10:17:55.286384078 +0000 UTC m=+100.399998359 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.330207 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.353101 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.376694 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.383829 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.383854 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.383862 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.383875 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.383883 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:53Z","lastTransitionTime":"2026-02-24T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.386595 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.386637 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.386733 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.386750 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.386761 4698 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.386797 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:55.38678448 +0000 UTC m=+100.500398721 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.387009 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.387027 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.387035 4698 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.387058 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:55.387051007 +0000 UTC m=+100.500665248 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.390703 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.406162 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.421490 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.433986 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.448662 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:53Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.486204 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:53 crc 
kubenswrapper[4698]: I0224 10:17:53.486242 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.486252 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.486279 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.486288 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:53Z","lastTransitionTime":"2026-02-24T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.588369 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.588409 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.588421 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.588436 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.588446 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:53Z","lastTransitionTime":"2026-02-24T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.614787 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.614864 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.615081 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:17:53 crc kubenswrapper[4698]: E0224 10:17:53.615000 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.618916 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.619628 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.620772 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.621470 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.622435 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.622938 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.623530 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.624487 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.625122 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.626004 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.626518 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.627530 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.628036 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.628535 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.629441 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.629920 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.630820 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.631213 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.631753 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.632764 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.633201 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.634102 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.634522 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.635467 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.636095 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.636817 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.637801 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.638236 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.639181 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.639624 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.640571 4698 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.640665 4698 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.642183 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.643042 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.643504 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.644974 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.645598 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.646451 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.647045 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.648024 4698 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.648501 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.649403 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.650006 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.651064 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.651508 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.652436 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.652937 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.653993 4698 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.654529 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.655394 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.655827 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.658141 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.659497 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.662420 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.691208 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.691237 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.691248 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.691284 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.691296 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:53Z","lastTransitionTime":"2026-02-24T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.794314 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.794350 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.794359 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.794376 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.794386 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:53Z","lastTransitionTime":"2026-02-24T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.896777 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.896811 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.896820 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.896833 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.896841 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:53Z","lastTransitionTime":"2026-02-24T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.999667 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.999715 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.999728 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.999746 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:53 crc kubenswrapper[4698]: I0224 10:17:53.999759 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:53Z","lastTransitionTime":"2026-02-24T10:17:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.071807 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.071876 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.075771 4698 generic.go:334] "Generic (PLEG): container finished" podID="90062989-bf1b-4479-89a0-f3bf0d438ac3" containerID="86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34" exitCode=0 Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.075812 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" event={"ID":"90062989-bf1b-4479-89a0-f3bf0d438ac3","Type":"ContainerDied","Data":"86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.103461 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:54Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.103722 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:54 crc 
kubenswrapper[4698]: I0224 10:17:54.104084 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.104104 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.104477 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.104745 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:54Z","lastTransitionTime":"2026-02-24T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.127617 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:54Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.154803 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:54Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.174607 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:54Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.206703 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.206736 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.206746 4698 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.206769 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.206781 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:54Z","lastTransitionTime":"2026-02-24T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.206850 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:54Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.220609 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:54Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.235001 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:54Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.250924 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:54Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.265163 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:54Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.284648 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:54Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.301732 4698 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:54Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.308956 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.308988 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.308999 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.309014 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.309026 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:54Z","lastTransitionTime":"2026-02-24T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.414326 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.414374 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.414403 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.414428 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.414445 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:54Z","lastTransitionTime":"2026-02-24T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.517232 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.517643 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.517656 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.517673 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.517682 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:54Z","lastTransitionTime":"2026-02-24T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.614466 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:17:54 crc kubenswrapper[4698]: E0224 10:17:54.614684 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.620573 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.620624 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.620641 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.620663 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.620680 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:54Z","lastTransitionTime":"2026-02-24T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.723876 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.723932 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.723955 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.723980 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.724001 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:54Z","lastTransitionTime":"2026-02-24T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.828255 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.828358 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.828382 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.828413 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.828435 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:54Z","lastTransitionTime":"2026-02-24T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.932161 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.932221 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.932242 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.932295 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:54 crc kubenswrapper[4698]: I0224 10:17:54.932314 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:54Z","lastTransitionTime":"2026-02-24T10:17:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.035874 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.035930 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.035949 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.035972 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.035989 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:55Z","lastTransitionTime":"2026-02-24T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.086304 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.086367 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.086402 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.086422 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.090147 4698 generic.go:334] "Generic (PLEG): container finished" podID="90062989-bf1b-4479-89a0-f3bf0d438ac3" containerID="42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa" exitCode=0 Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.090196 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" event={"ID":"90062989-bf1b-4479-89a0-f3bf0d438ac3","Type":"ContainerDied","Data":"42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.116741 4698 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.139418 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.140457 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 
10:17:55.140537 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.140563 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.140596 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.140621 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:55Z","lastTransitionTime":"2026-02-24T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.163960 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.201345 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.223551 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.241356 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.243665 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.243714 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.243727 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.243747 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.243764 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:55Z","lastTransitionTime":"2026-02-24T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.261656 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.280390 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed4
5b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.296778 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.316589 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.316782 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.316863 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:17:59.316822882 +0000 UTC m=+104.430437163 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.316929 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.316947 4698 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.317020 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:59.316999726 +0000 UTC m=+104.430613997 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.317093 4698 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.317150 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:59.317137109 +0000 UTC m=+104.430751390 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.318538 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.332388 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.346438 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.346468 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.346478 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.346494 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.346504 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:55Z","lastTransitionTime":"2026-02-24T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.417724 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.417784 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.417902 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.417917 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.417928 4698 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.417970 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:59.417956351 +0000 UTC m=+104.531570592 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.418016 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.418025 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.418033 4698 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.418052 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:17:59.418045865 +0000 UTC m=+104.531660106 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.448626 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.448666 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.448674 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.448686 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.448696 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:55Z","lastTransitionTime":"2026-02-24T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.551778 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.551844 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.551868 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.551904 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.551928 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:55Z","lastTransitionTime":"2026-02-24T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.614510 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.614546 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.614901 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:17:55 crc kubenswrapper[4698]: E0224 10:17:55.615057 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.637424 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.654949 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.655008 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.655024 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:55 crc 
kubenswrapper[4698]: I0224 10:17:55.655048 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.655065 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:55Z","lastTransitionTime":"2026-02-24T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.659882 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.680179 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.696855 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.718122 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.758231 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:55 crc 
kubenswrapper[4698]: I0224 10:17:55.758298 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.758314 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.758334 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.758347 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:55Z","lastTransitionTime":"2026-02-24T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.782846 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.813701 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.826335 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.838478 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.850793 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.860912 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.861205 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.861326 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.861408 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.861533 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.861979 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:55Z","lastTransitionTime":"2026-02-24T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.965353 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.965717 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.965735 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.965759 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:55 crc kubenswrapper[4698]: I0224 10:17:55.965778 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:55Z","lastTransitionTime":"2026-02-24T10:17:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.074738 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.074786 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.074802 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.074826 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.074843 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:56Z","lastTransitionTime":"2026-02-24T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.098865 4698 generic.go:334] "Generic (PLEG): container finished" podID="90062989-bf1b-4479-89a0-f3bf0d438ac3" containerID="5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41" exitCode=0 Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.098916 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" event={"ID":"90062989-bf1b-4479-89a0-f3bf0d438ac3","Type":"ContainerDied","Data":"5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.102027 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.122726 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.146797 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.169853 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.177509 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 
10:17:56.177556 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.177576 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.177601 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.177618 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:56Z","lastTransitionTime":"2026-02-24T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.191914 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.225852 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"
mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.240193 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.262513 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.277336 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.280845 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.280885 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.280896 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 
10:17:56.280916 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.280928 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:56Z","lastTransitionTime":"2026-02-24T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.291602 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.305476 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed4
5b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.325688 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.342405 4698 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.357519 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.375942 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.382622 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:56 crc 
kubenswrapper[4698]: I0224 10:17:56.382662 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.382673 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.382690 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.382699 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:56Z","lastTransitionTime":"2026-02-24T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.391842 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.426843 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.444352 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.458044 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.477598 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.485623 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 
10:17:56.485671 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.485687 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.485714 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.485732 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:56Z","lastTransitionTime":"2026-02-24T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.503684 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.519750 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.540628 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:56Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.588935 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.589008 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.589028 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.589051 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.589068 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:56Z","lastTransitionTime":"2026-02-24T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.614168 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:17:56 crc kubenswrapper[4698]: E0224 10:17:56.614382 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.691884 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.691961 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.691988 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.692020 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.692042 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:56Z","lastTransitionTime":"2026-02-24T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.794047 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.794086 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.794096 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.794113 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.794124 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:56Z","lastTransitionTime":"2026-02-24T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.896466 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.896511 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.896523 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.896539 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.896550 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:56Z","lastTransitionTime":"2026-02-24T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.999238 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.999307 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.999321 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.999341 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:56 crc kubenswrapper[4698]: I0224 10:17:56.999354 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:56Z","lastTransitionTime":"2026-02-24T10:17:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.101385 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.101423 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.101434 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.101451 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.101461 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.107679 4698 generic.go:334] "Generic (PLEG): container finished" podID="90062989-bf1b-4479-89a0-f3bf0d438ac3" containerID="05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8" exitCode=0 Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.107766 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" event={"ID":"90062989-bf1b-4479-89a0-f3bf0d438ac3","Type":"ContainerDied","Data":"05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.117541 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.129888 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.162646 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.191688 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.205730 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.205824 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.205882 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.205909 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.205926 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.212052 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.236419 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.252211 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.271035 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.285753 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.300590 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.308704 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.308777 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.308854 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.308881 4698 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.308900 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.323368 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.336519 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.413963 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.414025 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.414046 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.414072 4698 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.414091 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.517913 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.517980 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.518001 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.518029 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.518047 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.542639 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.542683 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.542701 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.542724 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.542739 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: E0224 10:17:57.566606 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.571798 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.571871 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.571894 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.571926 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.571948 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: E0224 10:17:57.591413 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.596253 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.596344 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.596367 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.596398 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.596418 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.614576 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:57 crc kubenswrapper[4698]: E0224 10:17:57.614740 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.614575 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:17:57 crc kubenswrapper[4698]: E0224 10:17:57.615241 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:17:57 crc kubenswrapper[4698]: E0224 10:17:57.618079 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.624089 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.624146 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.624164 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.624206 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.624224 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: E0224 10:17:57.645088 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.649846 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.649897 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.649915 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.649938 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.649954 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: E0224 10:17:57.670972 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:57Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:57 crc kubenswrapper[4698]: E0224 10:17:57.671187 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.673440 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.673488 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.673505 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.673529 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.673547 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.776365 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.776420 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.776438 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.776463 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.776481 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.879713 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.879780 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.879800 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.879827 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.879846 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.983108 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.983165 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.983188 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.983220 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:57 crc kubenswrapper[4698]: I0224 10:17:57.983241 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:57Z","lastTransitionTime":"2026-02-24T10:17:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.086561 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.086669 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.086690 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.086714 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.086730 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:58Z","lastTransitionTime":"2026-02-24T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.128317 4698 generic.go:334] "Generic (PLEG): container finished" podID="90062989-bf1b-4479-89a0-f3bf0d438ac3" containerID="7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2" exitCode=0 Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.128411 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" event={"ID":"90062989-bf1b-4479-89a0-f3bf0d438ac3","Type":"ContainerDied","Data":"7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2"} Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.148745 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.169566 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.187608 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.189578 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 
10:17:58.189653 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.189732 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.189771 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.189793 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:58Z","lastTransitionTime":"2026-02-24T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.211844 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a
7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.246978 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.263731 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.284129 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.294166 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.294218 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.294236 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.294285 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.294304 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:58Z","lastTransitionTime":"2026-02-24T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.304982 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.327393 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.352654 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.376334 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-mb4d7"] Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 
10:17:58.376875 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mb4d7" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.379092 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.379969 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"runn
ing\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.380820 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.380932 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.381082 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.398602 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.398646 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.398666 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.398691 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.398708 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:58Z","lastTransitionTime":"2026-02-24T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.404383 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.427567 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.451333 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.463595 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc3c474c-e869-4b47-94c5-f1ab3ce3c843-host\") pod \"node-ca-mb4d7\" (UID: \"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\") " pod="openshift-image-registry/node-ca-mb4d7" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.463681 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fc3c474c-e869-4b47-94c5-f1ab3ce3c843-serviceca\") pod \"node-ca-mb4d7\" (UID: \"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\") " pod="openshift-image-registry/node-ca-mb4d7" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.463761 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d8kb\" (UniqueName: \"kubernetes.io/projected/fc3c474c-e869-4b47-94c5-f1ab3ce3c843-kube-api-access-2d8kb\") pod \"node-ca-mb4d7\" (UID: \"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\") " pod="openshift-image-registry/node-ca-mb4d7" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.470121 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed4
5b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.490689 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.503029 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:58 crc 
kubenswrapper[4698]: I0224 10:17:58.503074 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.503087 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.503106 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.503121 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:58Z","lastTransitionTime":"2026-02-24T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.509805 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.530656 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.551636 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.565424 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fc3c474c-e869-4b47-94c5-f1ab3ce3c843-host\") pod \"node-ca-mb4d7\" (UID: \"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\") " pod="openshift-image-registry/node-ca-mb4d7" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.565521 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fc3c474c-e869-4b47-94c5-f1ab3ce3c843-serviceca\") pod \"node-ca-mb4d7\" (UID: \"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\") " pod="openshift-image-registry/node-ca-mb4d7" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.565579 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/fc3c474c-e869-4b47-94c5-f1ab3ce3c843-host\") pod \"node-ca-mb4d7\" (UID: \"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\") " pod="openshift-image-registry/node-ca-mb4d7" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.565590 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d8kb\" (UniqueName: \"kubernetes.io/projected/fc3c474c-e869-4b47-94c5-f1ab3ce3c843-kube-api-access-2d8kb\") pod \"node-ca-mb4d7\" (UID: \"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\") " pod="openshift-image-registry/node-ca-mb4d7" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.567313 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fc3c474c-e869-4b47-94c5-f1ab3ce3c843-serviceca\") pod \"node-ca-mb4d7\" (UID: \"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\") " pod="openshift-image-registry/node-ca-mb4d7" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.580803 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a
7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.593102 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d8kb\" (UniqueName: \"kubernetes.io/projected/fc3c474c-e869-4b47-94c5-f1ab3ce3c843-kube-api-access-2d8kb\") pod \"node-ca-mb4d7\" (UID: \"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\") " pod="openshift-image-registry/node-ca-mb4d7" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.610378 4698 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.610433 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.610449 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.610477 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.610495 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:58Z","lastTransitionTime":"2026-02-24T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.613714 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:17:58 crc kubenswrapper[4698]: E0224 10:17:58.613934 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.615848 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.630964 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.645124 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:58Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.703908 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mb4d7" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.713892 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.714206 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.714246 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.714304 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.714324 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:58Z","lastTransitionTime":"2026-02-24T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.817716 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.817765 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.817800 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.817816 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.817827 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:58Z","lastTransitionTime":"2026-02-24T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.920530 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.920579 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.920591 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.920609 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:58 crc kubenswrapper[4698]: I0224 10:17:58.920621 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:58Z","lastTransitionTime":"2026-02-24T10:17:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.023405 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.023472 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.023494 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.023524 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.023546 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:59Z","lastTransitionTime":"2026-02-24T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.125870 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.125919 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.125934 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.125957 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.125976 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:59Z","lastTransitionTime":"2026-02-24T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.135189 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mb4d7" event={"ID":"fc3c474c-e869-4b47-94c5-f1ab3ce3c843","Type":"ContainerStarted","Data":"257d5ddc0201337d243bdc881972528ef843cfe6a33026561020382b34c00b3d"} Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.141551 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" event={"ID":"90062989-bf1b-4479-89a0-f3bf0d438ac3","Type":"ContainerStarted","Data":"dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd"} Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.158354 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.177930 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.198375 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.215778 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.228749 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:59 crc 
kubenswrapper[4698]: I0224 10:17:59.228802 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.228825 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.228851 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.228869 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:59Z","lastTransitionTime":"2026-02-24T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.245080 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.264311 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.291394 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.319622 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.333005 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.333049 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.333062 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.333080 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.333094 4698 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:59Z","lastTransitionTime":"2026-02-24T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.340478 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.370894 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.373986 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.374144 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.374234 4698 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:18:07.374199574 +0000 UTC m=+112.487813825 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.374349 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.374375 4698 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.374465 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:07.37443643 +0000 UTC m=+112.488050741 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.374582 4698 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.374641 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:07.374628354 +0000 UTC m=+112.488242715 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.383447 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.397747 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:17:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.436531 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.436599 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.436616 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.436645 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.436663 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:59Z","lastTransitionTime":"2026-02-24T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.475908 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.476047 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.476256 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.476377 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.476402 4698 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.476321 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 
10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.476498 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:07.476467002 +0000 UTC m=+112.590081433 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.476504 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.476539 4698 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.476616 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:07.476588155 +0000 UTC m=+112.590202426 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.540739 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.540803 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.540824 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.540850 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.540867 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:59Z","lastTransitionTime":"2026-02-24T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.614196 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.614355 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.614864 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:17:59 crc kubenswrapper[4698]: E0224 10:17:59.614976 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.634599 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.636063 4698 scope.go:117] "RemoveContainer" containerID="64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.643862 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.643928 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.643946 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.643969 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.643987 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:59Z","lastTransitionTime":"2026-02-24T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.745617 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.745646 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.745656 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.745672 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.745682 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:59Z","lastTransitionTime":"2026-02-24T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.848799 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.848874 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.848899 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.848929 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.848954 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:59Z","lastTransitionTime":"2026-02-24T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.952954 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.953036 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.953058 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.953089 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:17:59 crc kubenswrapper[4698]: I0224 10:17:59.953113 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:17:59Z","lastTransitionTime":"2026-02-24T10:17:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.056343 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.056428 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.056452 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.056478 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.056501 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:00Z","lastTransitionTime":"2026-02-24T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.152690 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a"}
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.153214 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.153325 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.153348 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.155975 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mb4d7" event={"ID":"fc3c474c-e869-4b47-94c5-f1ab3ce3c843","Type":"ContainerStarted","Data":"7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8"}
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.160744 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.163404 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.163455 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.163478 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:18:00 crc
kubenswrapper[4698]: I0224 10:18:00.163505 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.163522 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:00Z","lastTransitionTime":"2026-02-24T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.165198 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc"} Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.177332 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.194885 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.196216 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.196159 4698 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.221213 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.241197 4698 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.263951 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.266479 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.266540 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.266565 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.266590 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.266608 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:00Z","lastTransitionTime":"2026-02-24T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.287743 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.306581 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.337609 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.369175 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.369790 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.369826 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.369842 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.369869 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.369891 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:00Z","lastTransitionTime":"2026-02-24T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.380993 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.398092 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.413064 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.431497 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.450867 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.473086 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.473390 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.473525 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.473663 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.473803 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:00Z","lastTransitionTime":"2026-02-24T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.476230 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.492561 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.510922 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.532391 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.547524 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.564049 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.578834 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.578945 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.578969 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.579053 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.579126 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:00Z","lastTransitionTime":"2026-02-24T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.600413 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.614614 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:00 crc kubenswrapper[4698]: E0224 10:18:00.615034 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.617828 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.636567 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.651113 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.661945 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.673422 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:00Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.682863 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:00 crc 
kubenswrapper[4698]: I0224 10:18:00.682963 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.683041 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.683120 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.683269 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:00Z","lastTransitionTime":"2026-02-24T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.786137 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.786362 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.786433 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.786508 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.786593 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:00Z","lastTransitionTime":"2026-02-24T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.889047 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.889339 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.889437 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.889530 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.889605 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:00Z","lastTransitionTime":"2026-02-24T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.991713 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.991771 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.991786 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.991807 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:00 crc kubenswrapper[4698]: I0224 10:18:00.991822 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:00Z","lastTransitionTime":"2026-02-24T10:18:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.094973 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.095304 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.095456 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.095581 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.095690 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:01Z","lastTransitionTime":"2026-02-24T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.169857 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.199384 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.199459 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.199483 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.199513 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.199536 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:01Z","lastTransitionTime":"2026-02-24T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.302099 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.302144 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.302161 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.302181 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.302197 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:01Z","lastTransitionTime":"2026-02-24T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.405449 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.405503 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.405520 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.405543 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.405560 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:01Z","lastTransitionTime":"2026-02-24T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.508573 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.508632 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.508654 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.508683 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.508704 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:01Z","lastTransitionTime":"2026-02-24T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.611352 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.611390 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.611401 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.611418 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.611429 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:01Z","lastTransitionTime":"2026-02-24T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.614322 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:01 crc kubenswrapper[4698]: E0224 10:18:01.614434 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.614567 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:01 crc kubenswrapper[4698]: E0224 10:18:01.614795 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.714956 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.715026 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.715050 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.715084 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.715153 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:01Z","lastTransitionTime":"2026-02-24T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.822465 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.822821 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.822952 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.823105 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.823240 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:01Z","lastTransitionTime":"2026-02-24T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.926437 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.926548 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.926567 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.926592 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:01 crc kubenswrapper[4698]: I0224 10:18:01.926628 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:01Z","lastTransitionTime":"2026-02-24T10:18:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.029209 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.029513 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.029598 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.029678 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.029749 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:02Z","lastTransitionTime":"2026-02-24T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.131807 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.131865 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.131883 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.131908 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.131925 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:02Z","lastTransitionTime":"2026-02-24T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.234016 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.234079 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.234097 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.234125 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.234143 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:02Z","lastTransitionTime":"2026-02-24T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.336486 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.336542 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.336560 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.336585 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.336604 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:02Z","lastTransitionTime":"2026-02-24T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.439651 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.439690 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.439698 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.439712 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.439721 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:02Z","lastTransitionTime":"2026-02-24T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.542646 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.543803 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.544039 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.544204 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.544368 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:02Z","lastTransitionTime":"2026-02-24T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.613743 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:02 crc kubenswrapper[4698]: E0224 10:18:02.613853 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.647457 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.647515 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.647532 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.647557 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.647573 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:02Z","lastTransitionTime":"2026-02-24T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.749946 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.750013 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.750031 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.750055 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.750072 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:02Z","lastTransitionTime":"2026-02-24T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.854357 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.854736 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.854754 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.854780 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.854798 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:02Z","lastTransitionTime":"2026-02-24T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.956808 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.956911 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.956966 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.956992 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:02 crc kubenswrapper[4698]: I0224 10:18:02.957009 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:02Z","lastTransitionTime":"2026-02-24T10:18:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.060543 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.060672 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.060690 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.060718 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.060735 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:03Z","lastTransitionTime":"2026-02-24T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.163893 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.163963 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.163981 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.164009 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.164039 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:03Z","lastTransitionTime":"2026-02-24T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.177622 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/0.log" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.182215 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a" exitCode=1 Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.182301 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a"} Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.183793 4698 scope.go:117] "RemoveContainer" containerID="98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.209713 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.236721 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.259011 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.270317 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:03 crc 
kubenswrapper[4698]: I0224 10:18:03.270577 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.270850 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.271138 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.271362 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:03Z","lastTransitionTime":"2026-02-24T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.281503 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.307990 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc0166
4c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:
17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.338482 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 10:18:02.975081 6372 reflector.go:311] Stopping reflector 
*v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.975197 6372 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.977432 6372 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 10:18:02.977496 6372 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:18:02.977502 6372 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 10:18:02.977524 6372 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 10:18:02.977556 6372 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:18:02.977595 6372 factory.go:656] Stopping watch factory\\\\nI0224 10:18:02.977609 6372 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:02.977666 6372 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0224 10:18:02.977676 6372 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:18:02.977693 6372 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:18:02.977701 6372 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 
10:18:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618
443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.354693 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.373882 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T1
0:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.375073 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.375117 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.375133 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.375158 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.375175 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:03Z","lastTransitionTime":"2026-02-24T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.391677 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.412979 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.433148 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.447936 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.466315 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:03Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.477323 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.477362 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.477370 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.477385 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.477394 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:03Z","lastTransitionTime":"2026-02-24T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.580057 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.580110 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.580128 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.580150 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.580169 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:03Z","lastTransitionTime":"2026-02-24T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.615578 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.615705 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:03 crc kubenswrapper[4698]: E0224 10:18:03.615779 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:03 crc kubenswrapper[4698]: E0224 10:18:03.615892 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.682405 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.682453 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.682470 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.682492 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.682509 4698 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:03Z","lastTransitionTime":"2026-02-24T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.786111 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.786178 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.786196 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.786222 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.786240 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:03Z","lastTransitionTime":"2026-02-24T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.888765 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.888824 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.888843 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.888919 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.888938 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:03Z","lastTransitionTime":"2026-02-24T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.991915 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.991965 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.991983 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.992007 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:03 crc kubenswrapper[4698]: I0224 10:18:03.992026 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:03Z","lastTransitionTime":"2026-02-24T10:18:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.095005 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.095087 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.095114 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.095147 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.095171 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:04Z","lastTransitionTime":"2026-02-24T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.188671 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/0.log" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.192296 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e"} Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.192885 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.197986 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.198020 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.198032 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.198049 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.198062 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:04Z","lastTransitionTime":"2026-02-24T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.232885 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.249932 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.267971 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.283238 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.300197 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.300244 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.300274 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 
10:18:04.300292 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.300304 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:04Z","lastTransitionTime":"2026-02-24T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.310214 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.325274 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed4
5b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.343391 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.357682 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.376832 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.392502 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.403097 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.403137 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.403150 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.403167 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.403183 4698 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:04Z","lastTransitionTime":"2026-02-24T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.417129 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.441888 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 10:18:02.975081 6372 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.975197 6372 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.977432 6372 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 10:18:02.977496 6372 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:18:02.977502 6372 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 10:18:02.977524 6372 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 10:18:02.977556 6372 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:18:02.977595 6372 factory.go:656] Stopping watch factory\\\\nI0224 10:18:02.977609 6372 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:02.977666 6372 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0224 10:18:02.977676 6372 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:18:02.977693 6372 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:18:02.977701 6372 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 
10:18:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.454816 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.504237 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk"] Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.504914 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.505405 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.505442 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.505453 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.505538 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.505555 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:04Z","lastTransitionTime":"2026-02-24T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.507799 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.508059 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.528414 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.544342 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.559730 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.575204 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed4
5b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.591123 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.608194 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:04 crc 
kubenswrapper[4698]: I0224 10:18:04.608250 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.608291 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.608313 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.608328 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:04Z","lastTransitionTime":"2026-02-24T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.608688 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.614357 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:04 crc kubenswrapper[4698]: E0224 10:18:04.614584 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.627571 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6d24b42-65c5-4a01-8f4a-6f970714ab76-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.627631 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6d24b42-65c5-4a01-8f4a-6f970714ab76-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.627821 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knwn8\" (UniqueName: \"kubernetes.io/projected/f6d24b42-65c5-4a01-8f4a-6f970714ab76-kube-api-access-knwn8\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.627903 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6d24b42-65c5-4a01-8f4a-6f970714ab76-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc 
kubenswrapper[4698]: I0224 10:18:04.631312 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.651203 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.672600 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.695062 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.715541 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.715578 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.715591 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.715608 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.715623 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:04Z","lastTransitionTime":"2026-02-24T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.718084 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 10:18:02.975081 6372 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.975197 6372 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.977432 6372 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 10:18:02.977496 6372 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:18:02.977502 6372 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 10:18:02.977524 6372 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 10:18:02.977556 6372 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:18:02.977595 6372 factory.go:656] Stopping watch factory\\\\nI0224 10:18:02.977609 6372 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:02.977666 6372 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0224 10:18:02.977676 6372 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:18:02.977693 6372 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:18:02.977701 6372 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 
10:18:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.728573 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6d24b42-65c5-4a01-8f4a-6f970714ab76-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.728624 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6d24b42-65c5-4a01-8f4a-6f970714ab76-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.728669 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knwn8\" (UniqueName: \"kubernetes.io/projected/f6d24b42-65c5-4a01-8f4a-6f970714ab76-kube-api-access-knwn8\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.728706 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6d24b42-65c5-4a01-8f4a-6f970714ab76-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.729615 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6d24b42-65c5-4a01-8f4a-6f970714ab76-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.730023 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6d24b42-65c5-4a01-8f4a-6f970714ab76-env-overrides\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.731037 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.738432 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6d24b42-65c5-4a01-8f4a-6f970714ab76-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.749030 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.758688 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knwn8\" (UniqueName: \"kubernetes.io/projected/f6d24b42-65c5-4a01-8f4a-6f970714ab76-kube-api-access-knwn8\") pod \"ovnkube-control-plane-749d76644c-bhrhk\" (UID: \"f6d24b42-65c5-4a01-8f4a-6f970714ab76\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.765026 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:04Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.818403 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.818438 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.818452 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.818474 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.818488 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:04Z","lastTransitionTime":"2026-02-24T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.823957 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" Feb 24 10:18:04 crc kubenswrapper[4698]: W0224 10:18:04.842534 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6d24b42_65c5_4a01_8f4a_6f970714ab76.slice/crio-fde784a5cfa847955b5912fcdd5a7522785c1719b8b065ed0798c9fc76a6e0f1 WatchSource:0}: Error finding container fde784a5cfa847955b5912fcdd5a7522785c1719b8b065ed0798c9fc76a6e0f1: Status 404 returned error can't find the container with id fde784a5cfa847955b5912fcdd5a7522785c1719b8b065ed0798c9fc76a6e0f1 Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.921643 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.921692 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.921709 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.921733 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:04 crc kubenswrapper[4698]: I0224 10:18:04.921750 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:04Z","lastTransitionTime":"2026-02-24T10:18:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.031572 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.031637 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.033513 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.034207 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.034337 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:05Z","lastTransitionTime":"2026-02-24T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.137988 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.138037 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.138053 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.138075 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.138094 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:05Z","lastTransitionTime":"2026-02-24T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.197824 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/1.log" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.198682 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/0.log" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.201467 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e" exitCode=1 Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.201562 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e"} Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.201654 4698 scope.go:117] "RemoveContainer" containerID="98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.202517 4698 scope.go:117] "RemoveContainer" containerID="4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.202638 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" event={"ID":"f6d24b42-65c5-4a01-8f4a-6f970714ab76","Type":"ContainerStarted","Data":"fde784a5cfa847955b5912fcdd5a7522785c1719b8b065ed0798c9fc76a6e0f1"} Feb 24 10:18:05 crc kubenswrapper[4698]: E0224 10:18:05.202767 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=ovnkube-controller pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.214851 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.227563 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:
52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.238326 4698 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.242435 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.242509 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.242535 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.242566 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.242592 4698 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:05Z","lastTransitionTime":"2026-02-24T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.258294 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.282937 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 10:18:02.975081 6372 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.975197 6372 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.977432 6372 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 10:18:02.977496 6372 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:18:02.977502 6372 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 10:18:02.977524 6372 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 10:18:02.977556 6372 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:18:02.977595 6372 factory.go:656] Stopping watch factory\\\\nI0224 10:18:02.977609 6372 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:02.977666 6372 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0224 10:18:02.977676 6372 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:18:02.977693 6372 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:18:02.977701 6372 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 10:18:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"5 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nI0224 10:18:04.755926 6551 
services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 3.573648ms\\\\nI0224 10:18:04.756185 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI0224 10:18:04.756205 6551 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 3.903445ms\\\\nI0224 10:18:04.756473 6551 obj_retry.go:551] Creating *factory.egressNode crc took: 8.810484ms\\\\nI0224 10:18:04.756500 6551 factory.go:1336] Added *v1.Node event handler 7\\\\nI0224 10:18:04.756534 6551 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0224 10:18:04.756802 6551 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:18:04.756875 6551 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:18:04.756903 6551 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:04.756935 6551 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:04.756995 6551 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b4
07618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.297030 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.310677 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.328641 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-rpnnm"] Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.329985 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:05 crc kubenswrapper[4698]: E0224 10:18:05.330109 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.332791 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.345366 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.345427 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.345447 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.345473 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.345492 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:05Z","lastTransitionTime":"2026-02-24T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.349586 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.371597 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.392461 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.404897 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.421810 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.436948 4698 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.437021 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.437105 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7xll\" (UniqueName: \"kubernetes.io/projected/17a1338b-6385-4795-9397-74316d6599d9-kube-api-access-t7xll\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.447366 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 
10:18:05.447426 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.447438 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.447455 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.447469 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:05Z","lastTransitionTime":"2026-02-24T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.454817 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.471619 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.491779 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.520071 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 10:18:02.975081 6372 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.975197 6372 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.977432 6372 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 10:18:02.977496 6372 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:18:02.977502 6372 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 10:18:02.977524 6372 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 10:18:02.977556 6372 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:18:02.977595 6372 factory.go:656] Stopping watch factory\\\\nI0224 10:18:02.977609 6372 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:02.977666 6372 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0224 10:18:02.977676 6372 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:18:02.977693 6372 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:18:02.977701 6372 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 10:18:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"5 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nI0224 10:18:04.755926 6551 services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 3.573648ms\\\\nI0224 10:18:04.756185 6551 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI0224 10:18:04.756205 6551 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 3.903445ms\\\\nI0224 10:18:04.756473 6551 obj_retry.go:551] Creating *factory.egressNode crc took: 8.810484ms\\\\nI0224 10:18:04.756500 6551 factory.go:1336] Added *v1.Node event handler 7\\\\nI0224 10:18:04.756534 6551 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0224 10:18:04.756802 6551 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:18:04.756875 6551 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:18:04.756903 6551 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:04.756935 6551 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:04.756995 6551 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\"
:\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc 
kubenswrapper[4698]: I0224 10:18:05.538063 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2
d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.538572 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.538678 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7xll\" (UniqueName: \"kubernetes.io/projected/17a1338b-6385-4795-9397-74316d6599d9-kube-api-access-t7xll\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:05 crc kubenswrapper[4698]: E0224 10:18:05.538741 4698 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:05 crc kubenswrapper[4698]: E0224 10:18:05.538846 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs podName:17a1338b-6385-4795-9397-74316d6599d9 nodeName:}" failed. 
No retries permitted until 2026-02-24 10:18:06.038815316 +0000 UTC m=+111.152429607 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs") pod "network-metrics-daemon-rpnnm" (UID: "17a1338b-6385-4795-9397-74316d6599d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.550557 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.550595 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.550610 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.550630 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.550645 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:05Z","lastTransitionTime":"2026-02-24T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.555341 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.564197 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7xll\" (UniqueName: \"kubernetes.io/projected/17a1338b-6385-4795-9397-74316d6599d9-kube-api-access-t7xll\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.570654 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.586166 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.600228 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.615498 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.616598 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:05 crc kubenswrapper[4698]: E0224 10:18:05.616805 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:05 crc kubenswrapper[4698]: E0224 10:18:05.617469 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.617748 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.637101 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed4
5b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.653801 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.653923 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:05 crc 
kubenswrapper[4698]: I0224 10:18:05.653977 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.654002 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.654034 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.654061 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:05Z","lastTransitionTime":"2026-02-24T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.675476 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.693718 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.706869 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.753177 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.756802 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.756845 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.756858 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.756875 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.756888 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:05Z","lastTransitionTime":"2026-02-24T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.768770 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.779691 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.789073 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.799762 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.809672 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.819673 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.828467 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc 
kubenswrapper[4698]: I0224 10:18:05.837454 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2
d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.847169 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.858661 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.858686 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.858694 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.858707 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.858715 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:05Z","lastTransitionTime":"2026-02-24T10:18:05Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.859638 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.871156 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.888168 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.911846 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://98d54a05ba6b01c286162caad21bbde6abb02b6690a6d4d2ade8faefd19a606a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:03Z\\\",\\\"message\\\":\\\":311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0224 10:18:02.975081 6372 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.975197 6372 reflector.go:311] Stopping 
reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0224 10:18:02.977432 6372 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0224 10:18:02.977496 6372 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0224 10:18:02.977502 6372 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0224 10:18:02.977524 6372 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0224 10:18:02.977556 6372 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0224 10:18:02.977595 6372 factory.go:656] Stopping watch factory\\\\nI0224 10:18:02.977609 6372 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:02.977666 6372 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0224 10:18:02.977676 6372 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:18:02.977693 6372 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0224 10:18:02.977701 6372 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0224 10:18:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"5 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nI0224 10:18:04.755926 6551 services_controller.go:360] Finished syncing service kube-controller-manager on namespace openshift-kube-controller-manager for network=default : 3.573648ms\\\\nI0224 10:18:04.756185 6551 loadbalancer.go:304] Deleted 0 stale LBs for 
map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI0224 10:18:04.756205 6551 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 3.903445ms\\\\nI0224 10:18:04.756473 6551 obj_retry.go:551] Creating *factory.egressNode crc took: 8.810484ms\\\\nI0224 10:18:04.756500 6551 factory.go:1336] Added *v1.Node event handler 7\\\\nI0224 10:18:04.756534 6551 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0224 10:18:04.756802 6551 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:18:04.756875 6551 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:18:04.756903 6551 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:04.756935 6551 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:04.756995 6551 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\"
:\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc 
kubenswrapper[4698]: I0224 10:18:05.924022 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.961908 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.961944 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.961956 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.961974 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:05 crc kubenswrapper[4698]: I0224 10:18:05.961985 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:05Z","lastTransitionTime":"2026-02-24T10:18:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.047953 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:06 crc kubenswrapper[4698]: E0224 10:18:06.048164 4698 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:06 crc kubenswrapper[4698]: E0224 10:18:06.048236 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs podName:17a1338b-6385-4795-9397-74316d6599d9 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:07.048214646 +0000 UTC m=+112.161828927 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs") pod "network-metrics-daemon-rpnnm" (UID: "17a1338b-6385-4795-9397-74316d6599d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.064685 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.064735 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.064748 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.064768 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.064781 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:06Z","lastTransitionTime":"2026-02-24T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.169480 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.169808 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.170014 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.170242 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.170465 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:06Z","lastTransitionTime":"2026-02-24T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.210461 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/1.log" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.216060 4698 scope.go:117] "RemoveContainer" containerID="4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e" Feb 24 10:18:06 crc kubenswrapper[4698]: E0224 10:18:06.216379 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.217130 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" event={"ID":"f6d24b42-65c5-4a01-8f4a-6f970714ab76","Type":"ContainerStarted","Data":"892da9f80566a48c6ace1fb4d7a16d824aad789a4ae631728a01c22a8d7b04f8"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.218858 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" event={"ID":"f6d24b42-65c5-4a01-8f4a-6f970714ab76","Type":"ContainerStarted","Data":"93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.235870 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.252356 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.268621 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.273323 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.273415 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.273440 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.273471 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.273495 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:06Z","lastTransitionTime":"2026-02-24T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.283625 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.300633 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.312693 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc 
kubenswrapper[4698]: I0224 10:18:06.339559 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"5 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nI0224 10:18:04.755926 6551 services_controller.go:360] Finished syncing service kube-controller-manager on 
namespace openshift-kube-controller-manager for network=default : 3.573648ms\\\\nI0224 10:18:04.756185 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI0224 10:18:04.756205 6551 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 3.903445ms\\\\nI0224 10:18:04.756473 6551 obj_retry.go:551] Creating *factory.egressNode crc took: 8.810484ms\\\\nI0224 10:18:04.756500 6551 factory.go:1336] Added *v1.Node event handler 7\\\\nI0224 10:18:04.756534 6551 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0224 10:18:04.756802 6551 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:18:04.756875 6551 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:18:04.756903 6551 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:04.756935 6551 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:04.756995 6551 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.356363 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.372845 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T1
0:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.377105 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.377381 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.377570 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.377761 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.377970 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:06Z","lastTransitionTime":"2026-02-24T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.393474 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.406737 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.430547 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.447796 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.469088 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.482934 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.483005 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.483017 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.483065 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.483079 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:06Z","lastTransitionTime":"2026-02-24T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.483946 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.501459 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.520913 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.537067 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.556429 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.569609 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc 
kubenswrapper[4698]: I0224 10:18:06.585482 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.585511 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.585519 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.585532 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.585540 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:06Z","lastTransitionTime":"2026-02-24T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.587842 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.599893 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.614229 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.614358 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:06 crc kubenswrapper[4698]: E0224 10:18:06.614482 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:06 crc kubenswrapper[4698]: E0224 10:18:06.614662 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.616938 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.633775 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.654741 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.678820 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"5 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nI0224 10:18:04.755926 6551 services_controller.go:360] Finished syncing service kube-controller-manager on 
namespace openshift-kube-controller-manager for network=default : 3.573648ms\\\\nI0224 10:18:04.756185 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI0224 10:18:04.756205 6551 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 3.903445ms\\\\nI0224 10:18:04.756473 6551 obj_retry.go:551] Creating *factory.egressNode crc took: 8.810484ms\\\\nI0224 10:18:04.756500 6551 factory.go:1336] Added *v1.Node event handler 7\\\\nI0224 10:18:04.756534 6551 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0224 10:18:04.756802 6551 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:18:04.756875 6551 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:18:04.756903 6551 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:04.756935 6551 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:04.756995 6551 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.689160 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.689244 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.689294 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.689318 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.689338 4698 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:06Z","lastTransitionTime":"2026-02-24T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.694546 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.710144 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad7
89a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.732533 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 
10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.748232 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:06Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.792660 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.792990 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.793188 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.793440 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.793632 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:06Z","lastTransitionTime":"2026-02-24T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.896390 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.896448 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.896465 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.896490 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.896507 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:06Z","lastTransitionTime":"2026-02-24T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.998846 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.998891 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.998906 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.998926 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:06 crc kubenswrapper[4698]: I0224 10:18:06.998941 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:06Z","lastTransitionTime":"2026-02-24T10:18:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.060086 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.060587 4698 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.060795 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs podName:17a1338b-6385-4795-9397-74316d6599d9 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:09.060772433 +0000 UTC m=+114.174386684 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs") pod "network-metrics-daemon-rpnnm" (UID: "17a1338b-6385-4795-9397-74316d6599d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.101239 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.101317 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.101337 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.101365 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.101384 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.205797 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.205883 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.205923 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.205959 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.205983 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.308955 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.309002 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.309014 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.309032 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.309044 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.411453 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.411493 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.411505 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.411520 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.411532 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.466538 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.466656 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.466747 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.466811 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:18:23.466779977 +0000 UTC m=+128.580394258 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.466899 4698 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.466971 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:23.466955641 +0000 UTC m=+128.580569892 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.466992 4698 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.467147 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-24 10:18:23.467109435 +0000 UTC m=+128.580723736 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.514351 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.514422 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.514441 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.514466 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.514483 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.568312 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.568417 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.568562 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.568603 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.568620 4698 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.568631 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 
10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.568658 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.568680 4698 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.568686 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:23.568667866 +0000 UTC m=+128.682282117 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.568748 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:23.568726847 +0000 UTC m=+128.682341128 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.614298 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.614379 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.614482 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.614611 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.616636 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.616689 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.616709 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.616732 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.616750 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.720430 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.720537 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.720550 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.720570 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.720583 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.746624 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.746672 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.746721 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.746745 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.746758 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.763946 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:07Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.767452 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.767491 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.767505 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.767525 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.767539 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.784758 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:07Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.789352 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.789383 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.789392 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.789407 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.789418 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.801337 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:07Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.806246 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.806278 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.806287 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.806302 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.806313 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.824377 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:07Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.828892 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.828959 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.828980 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.829006 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.829025 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.842418 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:07Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:07 crc kubenswrapper[4698]: E0224 10:18:07.842602 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.844684 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.844711 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.844720 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.844735 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.844746 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.946971 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.947011 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.947023 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.947040 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:07 crc kubenswrapper[4698]: I0224 10:18:07.947052 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:07Z","lastTransitionTime":"2026-02-24T10:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.050497 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.050557 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.050572 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.050596 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.050610 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:08Z","lastTransitionTime":"2026-02-24T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.153705 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.153786 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.153810 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.153847 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.153871 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:08Z","lastTransitionTime":"2026-02-24T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.256489 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.256531 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.256544 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.256561 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.256575 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:08Z","lastTransitionTime":"2026-02-24T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.359043 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.359088 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.359098 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.359116 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.359145 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:08Z","lastTransitionTime":"2026-02-24T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.462806 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.462859 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.462878 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.462901 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.462916 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:08Z","lastTransitionTime":"2026-02-24T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.565711 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.566048 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.566059 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.566074 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.566083 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:08Z","lastTransitionTime":"2026-02-24T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.613832 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.613917 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:08 crc kubenswrapper[4698]: E0224 10:18:08.613971 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:08 crc kubenswrapper[4698]: E0224 10:18:08.614033 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.668327 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.668406 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.668434 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.668465 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.668488 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:08Z","lastTransitionTime":"2026-02-24T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.771682 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.771762 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.771780 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.771803 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.771820 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:08Z","lastTransitionTime":"2026-02-24T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.877617 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.877677 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.877691 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.877712 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.877730 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:08Z","lastTransitionTime":"2026-02-24T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.980142 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.980237 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.980294 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.980358 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:08 crc kubenswrapper[4698]: I0224 10:18:08.980387 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:08Z","lastTransitionTime":"2026-02-24T10:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.083522 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.083576 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.083589 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.083608 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.083622 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:09Z","lastTransitionTime":"2026-02-24T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.086903 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:09 crc kubenswrapper[4698]: E0224 10:18:09.087038 4698 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:09 crc kubenswrapper[4698]: E0224 10:18:09.087115 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs podName:17a1338b-6385-4795-9397-74316d6599d9 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:13.087094685 +0000 UTC m=+118.200708936 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs") pod "network-metrics-daemon-rpnnm" (UID: "17a1338b-6385-4795-9397-74316d6599d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.186158 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.186218 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.186228 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.186244 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.186253 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:09Z","lastTransitionTime":"2026-02-24T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.289841 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.289910 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.289928 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.289952 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.289969 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:09Z","lastTransitionTime":"2026-02-24T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.392911 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.393068 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.393096 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.393128 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.393153 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:09Z","lastTransitionTime":"2026-02-24T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.495030 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.495073 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.495086 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.495103 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.495114 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:09Z","lastTransitionTime":"2026-02-24T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.598069 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.598120 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.598132 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.598149 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.598159 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:09Z","lastTransitionTime":"2026-02-24T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.614761 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.614772 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:09 crc kubenswrapper[4698]: E0224 10:18:09.614944 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:09 crc kubenswrapper[4698]: E0224 10:18:09.615019 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.701017 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.701044 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.701052 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.701065 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.701073 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:09Z","lastTransitionTime":"2026-02-24T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.803811 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.803862 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.803874 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.803890 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.803900 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:09Z","lastTransitionTime":"2026-02-24T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.907321 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.907527 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.907547 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.907572 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:09 crc kubenswrapper[4698]: I0224 10:18:09.907589 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:09Z","lastTransitionTime":"2026-02-24T10:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.010424 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.010478 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.010498 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.010523 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.010543 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:10Z","lastTransitionTime":"2026-02-24T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.127675 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.127754 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.127798 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.127835 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.127895 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:10Z","lastTransitionTime":"2026-02-24T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.231564 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.231609 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.231621 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.231639 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.231654 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:10Z","lastTransitionTime":"2026-02-24T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.334063 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.334124 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.334143 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.334169 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.334187 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:10Z","lastTransitionTime":"2026-02-24T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.437533 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.437596 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.437613 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.437637 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.437658 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:10Z","lastTransitionTime":"2026-02-24T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.540929 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.540989 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.541008 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.541031 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.541048 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:10Z","lastTransitionTime":"2026-02-24T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.613819 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.613837 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:10 crc kubenswrapper[4698]: E0224 10:18:10.614035 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:10 crc kubenswrapper[4698]: E0224 10:18:10.614140 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.643496 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.643556 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.643573 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.643599 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.643618 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:10Z","lastTransitionTime":"2026-02-24T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.745923 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.745997 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.746017 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.746048 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.746065 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:10Z","lastTransitionTime":"2026-02-24T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.849749 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.849806 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.849822 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.849847 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.849864 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:10Z","lastTransitionTime":"2026-02-24T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.953691 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.953748 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.953765 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.953789 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:10 crc kubenswrapper[4698]: I0224 10:18:10.953811 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:10Z","lastTransitionTime":"2026-02-24T10:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.057110 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.057190 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.057208 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.057234 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.057253 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:11Z","lastTransitionTime":"2026-02-24T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.159999 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.160064 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.160081 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.160099 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.160110 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:11Z","lastTransitionTime":"2026-02-24T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.263724 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.263787 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.263804 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.263829 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.263849 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:11Z","lastTransitionTime":"2026-02-24T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.368607 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.368672 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.368695 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.368725 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.368745 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:11Z","lastTransitionTime":"2026-02-24T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.471394 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.471468 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.471486 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.471510 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.471553 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:11Z","lastTransitionTime":"2026-02-24T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.574782 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.574844 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.574862 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.574885 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.574902 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:11Z","lastTransitionTime":"2026-02-24T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.614450 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.614510 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:11 crc kubenswrapper[4698]: E0224 10:18:11.614632 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:11 crc kubenswrapper[4698]: E0224 10:18:11.615008 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.677645 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.677721 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.677758 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.677782 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.677800 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:11Z","lastTransitionTime":"2026-02-24T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.781251 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.781354 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.781376 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.781399 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.781418 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:11Z","lastTransitionTime":"2026-02-24T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.884338 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.884405 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.884423 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.884448 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.884465 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:11Z","lastTransitionTime":"2026-02-24T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.987800 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.987867 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.987885 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.987908 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:11 crc kubenswrapper[4698]: I0224 10:18:11.987925 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:11Z","lastTransitionTime":"2026-02-24T10:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.091574 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.091640 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.091659 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.091684 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.091703 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:12Z","lastTransitionTime":"2026-02-24T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.194764 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.194828 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.194851 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.194881 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.194905 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:12Z","lastTransitionTime":"2026-02-24T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.297764 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.297818 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.297835 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.297858 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.297875 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:12Z","lastTransitionTime":"2026-02-24T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.402397 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.402469 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.402492 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.402525 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.402547 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:12Z","lastTransitionTime":"2026-02-24T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.504642 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.504676 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.504686 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.504700 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.504711 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:12Z","lastTransitionTime":"2026-02-24T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.608111 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.608159 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.608173 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.608194 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.608207 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:12Z","lastTransitionTime":"2026-02-24T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.613666 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.613735 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:12 crc kubenswrapper[4698]: E0224 10:18:12.613785 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:12 crc kubenswrapper[4698]: E0224 10:18:12.613953 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.711752 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.711812 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.711829 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.711849 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.711863 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:12Z","lastTransitionTime":"2026-02-24T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.815310 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.815389 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.815413 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.815444 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.815466 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:12Z","lastTransitionTime":"2026-02-24T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.917949 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.917996 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.918012 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.918035 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:12 crc kubenswrapper[4698]: I0224 10:18:12.918052 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:12Z","lastTransitionTime":"2026-02-24T10:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.020368 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.020412 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.020423 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.020441 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.020453 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:13Z","lastTransitionTime":"2026-02-24T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.123718 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.123761 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.123769 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.123783 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.123793 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:13Z","lastTransitionTime":"2026-02-24T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.155533 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:13 crc kubenswrapper[4698]: E0224 10:18:13.155680 4698 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:13 crc kubenswrapper[4698]: E0224 10:18:13.155736 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs podName:17a1338b-6385-4795-9397-74316d6599d9 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:21.155719323 +0000 UTC m=+126.269333574 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs") pod "network-metrics-daemon-rpnnm" (UID: "17a1338b-6385-4795-9397-74316d6599d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.226899 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.226984 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.227009 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.227041 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.227064 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:13Z","lastTransitionTime":"2026-02-24T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.331314 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.331398 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.331422 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.331455 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.331478 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:13Z","lastTransitionTime":"2026-02-24T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.433863 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.433915 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.433933 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.433956 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.433975 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:13Z","lastTransitionTime":"2026-02-24T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.537389 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.537476 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.537496 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.537521 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.537539 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:13Z","lastTransitionTime":"2026-02-24T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.614430 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.614469 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:13 crc kubenswrapper[4698]: E0224 10:18:13.614573 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:13 crc kubenswrapper[4698]: E0224 10:18:13.614658 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.640090 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.640221 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.640254 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.640334 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.640364 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:13Z","lastTransitionTime":"2026-02-24T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.743205 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.743333 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.743357 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.743386 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.743408 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:13Z","lastTransitionTime":"2026-02-24T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.846851 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.846936 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.846992 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.847024 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.847046 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:13Z","lastTransitionTime":"2026-02-24T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.955764 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.955840 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.955866 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.955912 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:13 crc kubenswrapper[4698]: I0224 10:18:13.955942 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:13Z","lastTransitionTime":"2026-02-24T10:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.060129 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.060218 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.060239 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.060301 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.060320 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:14Z","lastTransitionTime":"2026-02-24T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.163512 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.163588 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.163609 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.163633 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.163675 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:14Z","lastTransitionTime":"2026-02-24T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.266415 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.266481 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.266502 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.266536 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.266562 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:14Z","lastTransitionTime":"2026-02-24T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.369802 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.369874 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.369891 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.369914 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.369931 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:14Z","lastTransitionTime":"2026-02-24T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.472438 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.472513 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.472538 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.472572 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.472595 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:14Z","lastTransitionTime":"2026-02-24T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.575378 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.575472 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.575489 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.575569 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.575595 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:14Z","lastTransitionTime":"2026-02-24T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.613676 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.613704 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:14 crc kubenswrapper[4698]: E0224 10:18:14.613866 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:14 crc kubenswrapper[4698]: E0224 10:18:14.613996 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.679378 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.679457 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.679479 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.679510 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.679531 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:14Z","lastTransitionTime":"2026-02-24T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.782572 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.782942 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.783091 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.783228 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.783418 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:14Z","lastTransitionTime":"2026-02-24T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.886875 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.886930 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.886954 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.886981 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.887005 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:14Z","lastTransitionTime":"2026-02-24T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.990224 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.990353 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.990380 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.990414 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:14 crc kubenswrapper[4698]: I0224 10:18:14.990437 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:14Z","lastTransitionTime":"2026-02-24T10:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.094164 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.094483 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.094505 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.094530 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.094550 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:15Z","lastTransitionTime":"2026-02-24T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.197344 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.197392 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.197409 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.197432 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.197465 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:15Z","lastTransitionTime":"2026-02-24T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.300226 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.300594 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.300789 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.300989 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.301175 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:15Z","lastTransitionTime":"2026-02-24T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.403772 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.403839 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.403861 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.403888 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.403912 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:15Z","lastTransitionTime":"2026-02-24T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.506757 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.506804 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.506820 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.506842 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.506858 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:15Z","lastTransitionTime":"2026-02-24T10:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:15 crc kubenswrapper[4698]: E0224 10:18:15.607224 4698 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.613923 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.613926 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:15 crc kubenswrapper[4698]: E0224 10:18:15.614060 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:15 crc kubenswrapper[4698]: E0224 10:18:15.614401 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.637131 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.658623 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.673648 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.697013 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.719794 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"5 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nI0224 10:18:04.755926 6551 services_controller.go:360] Finished syncing service kube-controller-manager on 
namespace openshift-kube-controller-manager for network=default : 3.573648ms\\\\nI0224 10:18:04.756185 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI0224 10:18:04.756205 6551 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 3.903445ms\\\\nI0224 10:18:04.756473 6551 obj_retry.go:551] Creating *factory.egressNode crc took: 8.810484ms\\\\nI0224 10:18:04.756500 6551 factory.go:1336] Added *v1.Node event handler 7\\\\nI0224 10:18:04.756534 6551 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0224 10:18:04.756802 6551 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:18:04.756875 6551 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:18:04.756903 6551 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:04.756935 6551 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:04.756995 6551 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.734580 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: E0224 10:18:15.736011 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.753104 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad7
89a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.777488 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 
10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.792881 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.808991 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.826947 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.847109 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.865205 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.879500 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:15 crc kubenswrapper[4698]: I0224 10:18:15.892996 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:15Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:16 crc 
kubenswrapper[4698]: I0224 10:18:16.614683 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:16 crc kubenswrapper[4698]: I0224 10:18:16.614764 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:16 crc kubenswrapper[4698]: E0224 10:18:16.614869 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:16 crc kubenswrapper[4698]: E0224 10:18:16.615090 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:16 crc kubenswrapper[4698]: I0224 10:18:16.630202 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.614129 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.614190 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:17 crc kubenswrapper[4698]: E0224 10:18:17.614368 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:17 crc kubenswrapper[4698]: E0224 10:18:17.614521 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.878732 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.878798 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.878819 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.878844 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.878861 4698 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:17Z","lastTransitionTime":"2026-02-24T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:17 crc kubenswrapper[4698]: E0224 10:18:17.901377 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.908033 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.908102 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.908122 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.908148 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.908165 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:17Z","lastTransitionTime":"2026-02-24T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:17 crc kubenswrapper[4698]: E0224 10:18:17.928850 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.933609 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.933666 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.933684 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.933709 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.933726 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:17Z","lastTransitionTime":"2026-02-24T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:17 crc kubenswrapper[4698]: E0224 10:18:17.952444 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.957810 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.957844 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.957856 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.957875 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.957905 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:17Z","lastTransitionTime":"2026-02-24T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:17 crc kubenswrapper[4698]: E0224 10:18:17.971770 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.976104 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.976190 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.976216 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.976250 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:17 crc kubenswrapper[4698]: I0224 10:18:17.976307 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:17Z","lastTransitionTime":"2026-02-24T10:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:17 crc kubenswrapper[4698]: E0224 10:18:17.997897 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:17Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:17 crc kubenswrapper[4698]: E0224 10:18:17.998287 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.580200 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.605054 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25971
26bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.614247 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.614315 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:18 crc kubenswrapper[4698]: E0224 10:18:18.614529 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:18 crc kubenswrapper[4698]: E0224 10:18:18.614648 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.623942 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.646667 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\
"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.666155 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.685943 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.706125 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.723998 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.745153 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.764768 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc 
kubenswrapper[4698]: I0224 10:18:18.780818 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2
d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.800211 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.820340 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.839249 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.861850 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.890728 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"5 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nI0224 10:18:04.755926 6551 services_controller.go:360] Finished syncing service kube-controller-manager on 
namespace openshift-kube-controller-manager for network=default : 3.573648ms\\\\nI0224 10:18:04.756185 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI0224 10:18:04.756205 6551 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 3.903445ms\\\\nI0224 10:18:04.756473 6551 obj_retry.go:551] Creating *factory.egressNode crc took: 8.810484ms\\\\nI0224 10:18:04.756500 6551 factory.go:1336] Added *v1.Node event handler 7\\\\nI0224 10:18:04.756534 6551 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0224 10:18:04.756802 6551 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:18:04.756875 6551 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:18:04.756903 6551 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:04.756935 6551 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:04.756995 6551 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:18 crc kubenswrapper[4698]: I0224 10:18:18.905537 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad7
89a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:18Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:19 crc kubenswrapper[4698]: I0224 10:18:19.613848 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:19 crc kubenswrapper[4698]: E0224 10:18:19.613993 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:19 crc kubenswrapper[4698]: I0224 10:18:19.613852 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:19 crc kubenswrapper[4698]: E0224 10:18:19.614222 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:20 crc kubenswrapper[4698]: I0224 10:18:20.614743 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:20 crc kubenswrapper[4698]: I0224 10:18:20.614780 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:20 crc kubenswrapper[4698]: E0224 10:18:20.614946 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:20 crc kubenswrapper[4698]: E0224 10:18:20.615139 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:20 crc kubenswrapper[4698]: I0224 10:18:20.617754 4698 scope.go:117] "RemoveContainer" containerID="4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e" Feb 24 10:18:20 crc kubenswrapper[4698]: E0224 10:18:20.738079 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.246505 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:21 crc kubenswrapper[4698]: E0224 10:18:21.246749 4698 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:21 crc kubenswrapper[4698]: E0224 10:18:21.246880 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs podName:17a1338b-6385-4795-9397-74316d6599d9 nodeName:}" failed. 
No retries permitted until 2026-02-24 10:18:37.246835137 +0000 UTC m=+142.360449418 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs") pod "network-metrics-daemon-rpnnm" (UID: "17a1338b-6385-4795-9397-74316d6599d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.274834 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/1.log" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.279481 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c"} Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.279930 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.297720 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.317617 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.333070 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.352581 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.363430 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc 
kubenswrapper[4698]: I0224 10:18:21.386953 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"5 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nI0224 10:18:04.755926 6551 services_controller.go:360] Finished syncing service kube-controller-manager on 
namespace openshift-kube-controller-manager for network=default : 3.573648ms\\\\nI0224 10:18:04.756185 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI0224 10:18:04.756205 6551 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 3.903445ms\\\\nI0224 10:18:04.756473 6551 obj_retry.go:551] Creating *factory.egressNode crc took: 8.810484ms\\\\nI0224 10:18:04.756500 6551 factory.go:1336] Added *v1.Node event handler 7\\\\nI0224 10:18:04.756534 6551 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0224 10:18:04.756802 6551 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:18:04.756875 6551 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:18:04.756903 6551 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:04.756935 6551 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:04.756995 6551 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.399628 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.412151 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T1
0:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.426199 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.443237 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.463525 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.480570 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad789a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.493288 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"
name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc
478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.505354 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.517918 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.534693 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:21Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.614383 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:21 crc kubenswrapper[4698]: I0224 10:18:21.614419 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:21 crc kubenswrapper[4698]: E0224 10:18:21.614539 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:21 crc kubenswrapper[4698]: E0224 10:18:21.614636 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.287407 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/2.log" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.289490 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/1.log" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.294041 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c" exitCode=1 Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.294102 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c"} Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.294150 4698 scope.go:117] "RemoveContainer" containerID="4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.295416 4698 scope.go:117] "RemoveContainer" containerID="0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c" Feb 24 
10:18:22 crc kubenswrapper[4698]: E0224 10:18:22.295682 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.320467 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc 
kubenswrapper[4698]: I0224 10:18:22.341387 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.365965 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.384583 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.415718 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.440297 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc0166
4c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:
17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.474109 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c4183b7a2d42eded3a4a62df9ef06d127a9a288a8e51010277cb370cf9d019e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"5 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}\\\\nI0224 10:18:04.755926 6551 services_controller.go:360] Finished syncing service kube-controller-manager on 
namespace openshift-kube-controller-manager for network=default : 3.573648ms\\\\nI0224 10:18:04.756185 6551 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}\\\\nI0224 10:18:04.756205 6551 services_controller.go:360] Finished syncing service apiserver on namespace openshift-kube-apiserver for network=default : 3.903445ms\\\\nI0224 10:18:04.756473 6551 obj_retry.go:551] Creating *factory.egressNode crc took: 8.810484ms\\\\nI0224 10:18:04.756500 6551 factory.go:1336] Added *v1.Node event handler 7\\\\nI0224 10:18:04.756534 6551 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0224 10:18:04.756802 6551 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0224 10:18:04.756875 6551 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0224 10:18:04.756903 6551 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:04.756935 6551 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:04.756995 6551 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:03Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:21Z\\\",\\\"message\\\":\\\"/multus-additional-cni-plugins-jlg97\\\\nI0224 10:18:21.771147 6785 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0224 10:18:21.771034 6785 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:21.771150 6785 default_network_controller.go:776] Recording success event on pod 
openshift-ovn-kubernetes/ovnkube-node-mgh7p\\\\nI0224 10:18:21.771169 6785 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF0224 10:18:21.771174 6785 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not y\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\"
:\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/
env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc 
kubenswrapper[4698]: I0224 10:18:22.490161 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2
d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.508301 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.531783 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.552050 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.571011 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d
824aad789a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.595114 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081
f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.607927 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.614672 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.614699 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:22 crc kubenswrapper[4698]: E0224 10:18:22.614923 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:22 crc kubenswrapper[4698]: E0224 10:18:22.615041 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.629616 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop 
'(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:22 crc kubenswrapper[4698]: I0224 10:18:22.650168 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:22Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.301356 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/2.log" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.306275 4698 scope.go:117] "RemoveContainer" containerID="0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c" Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.306463 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.320354 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:
08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98
e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.334797 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.352378 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.373466 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.390380 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.410078 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.424516 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.438438 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.452760 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc 
kubenswrapper[4698]: I0224 10:18:23.468739 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.471176 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.471383 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:18:55.471354451 +0000 UTC m=+160.584968692 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.471470 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.471525 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.471634 4698 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.471752 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:55.4717245 +0000 UTC m=+160.585338771 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.471654 4698 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.471894 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:55.471855313 +0000 UTC m=+160.585469594 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.487080 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.503548 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.541745 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.573033 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.573099 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.573223 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.573243 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.573278 4698 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 
10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.573223 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.573308 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.573321 4698 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.573333 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:18:55.5733139 +0000 UTC m=+160.686928151 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.573365 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-02-24 10:18:55.573353431 +0000 UTC m=+160.686967672 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.584493 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:21Z\\\",\\\"message\\\":\\\"/multus-additional-cni-plugins-jlg97\\\\nI0224 10:18:21.771147 6785 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0224 10:18:21.771034 6785 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:21.771150 6785 
default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mgh7p\\\\nI0224 10:18:21.771169 6785 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF0224 10:18:21.771174 6785 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not y\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.602288 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.614480 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.614511 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.614963 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:23 crc kubenswrapper[4698]: E0224 10:18:23.614991 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:23 crc kubenswrapper[4698]: I0224 10:18:23.620355 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control
-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad789a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:23Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:24 crc kubenswrapper[4698]: I0224 10:18:24.613860 4698 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:24 crc kubenswrapper[4698]: E0224 10:18:24.614281 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:24 crc kubenswrapper[4698]: I0224 10:18:24.613945 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:24 crc kubenswrapper[4698]: E0224 10:18:24.614354 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.614743 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.614802 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:25 crc kubenswrapper[4698]: E0224 10:18:25.615541 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:25 crc kubenswrapper[4698]: E0224 10:18:25.615692 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.637822 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"v
olumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.657328 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.672811 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.693152 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.710627 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.732943 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: E0224 10:18:25.739081 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.753055 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.776391 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"n
ame\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.793677 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc 
kubenswrapper[4698]: I0224 10:18:25.810098 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2
d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.829237 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.848762 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.867525 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.891156 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.923541 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:21Z\\\",\\\"message\\\":\\\"/multus-additional-cni-plugins-jlg97\\\\nI0224 10:18:21.771147 6785 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0224 10:18:21.771034 6785 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:21.771150 6785 
default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mgh7p\\\\nI0224 10:18:21.771169 6785 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF0224 10:18:21.771174 6785 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not y\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:25 crc kubenswrapper[4698]: I0224 10:18:25.945009 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad7
89a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:25Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:26 crc kubenswrapper[4698]: I0224 10:18:26.614515 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:26 crc kubenswrapper[4698]: I0224 10:18:26.614574 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:26 crc kubenswrapper[4698]: E0224 10:18:26.614666 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:26 crc kubenswrapper[4698]: E0224 10:18:26.614806 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:27 crc kubenswrapper[4698]: I0224 10:18:27.614083 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:27 crc kubenswrapper[4698]: I0224 10:18:27.614175 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:27 crc kubenswrapper[4698]: E0224 10:18:27.614347 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:27 crc kubenswrapper[4698]: E0224 10:18:27.614468 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.040240 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.040289 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.040300 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.040315 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.040324 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:28Z","lastTransitionTime":"2026-02-24T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:28 crc kubenswrapper[4698]: E0224 10:18:28.060110 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:28Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.069331 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.069404 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.069423 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.069448 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.069468 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:28Z","lastTransitionTime":"2026-02-24T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:28 crc kubenswrapper[4698]: E0224 10:18:28.090456 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:28Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.095402 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.095484 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.095501 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.095527 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.095544 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:28Z","lastTransitionTime":"2026-02-24T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:28 crc kubenswrapper[4698]: E0224 10:18:28.114839 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:28Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.120043 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.120250 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.120423 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.120576 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.120708 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:28Z","lastTransitionTime":"2026-02-24T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:28 crc kubenswrapper[4698]: E0224 10:18:28.142087 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:28Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.146431 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.146483 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.146502 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.146526 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.146543 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:28Z","lastTransitionTime":"2026-02-24T10:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:28 crc kubenswrapper[4698]: E0224 10:18:28.166767 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:28Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:28 crc kubenswrapper[4698]: E0224 10:18:28.167296 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.614386 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:28 crc kubenswrapper[4698]: I0224 10:18:28.614397 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:28 crc kubenswrapper[4698]: E0224 10:18:28.614588 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:28 crc kubenswrapper[4698]: E0224 10:18:28.614786 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:29 crc kubenswrapper[4698]: I0224 10:18:29.613756 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:29 crc kubenswrapper[4698]: E0224 10:18:29.613925 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:29 crc kubenswrapper[4698]: I0224 10:18:29.614329 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:29 crc kubenswrapper[4698]: E0224 10:18:29.614469 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:30 crc kubenswrapper[4698]: I0224 10:18:30.614087 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:30 crc kubenswrapper[4698]: I0224 10:18:30.614130 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:30 crc kubenswrapper[4698]: E0224 10:18:30.614193 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:30 crc kubenswrapper[4698]: E0224 10:18:30.614364 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:30 crc kubenswrapper[4698]: E0224 10:18:30.740149 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:18:31 crc kubenswrapper[4698]: I0224 10:18:31.614654 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:31 crc kubenswrapper[4698]: I0224 10:18:31.614654 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:31 crc kubenswrapper[4698]: E0224 10:18:31.614805 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:31 crc kubenswrapper[4698]: E0224 10:18:31.614952 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:32 crc kubenswrapper[4698]: I0224 10:18:32.613655 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:32 crc kubenswrapper[4698]: I0224 10:18:32.613701 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:32 crc kubenswrapper[4698]: E0224 10:18:32.613997 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:32 crc kubenswrapper[4698]: E0224 10:18:32.614177 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:32 crc kubenswrapper[4698]: I0224 10:18:32.632814 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 24 10:18:33 crc kubenswrapper[4698]: I0224 10:18:33.614257 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:33 crc kubenswrapper[4698]: I0224 10:18:33.614388 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:33 crc kubenswrapper[4698]: E0224 10:18:33.614434 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:33 crc kubenswrapper[4698]: E0224 10:18:33.614611 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:34 crc kubenswrapper[4698]: I0224 10:18:34.614506 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:34 crc kubenswrapper[4698]: I0224 10:18:34.614577 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:34 crc kubenswrapper[4698]: E0224 10:18:34.614695 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:34 crc kubenswrapper[4698]: E0224 10:18:34.615518 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:34 crc kubenswrapper[4698]: I0224 10:18:34.616030 4698 scope.go:117] "RemoveContainer" containerID="0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c" Feb 24 10:18:34 crc kubenswrapper[4698]: E0224 10:18:34.616405 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.613761 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:35 crc kubenswrapper[4698]: E0224 10:18:35.613980 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.614032 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:35 crc kubenswrapper[4698]: E0224 10:18:35.614189 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.635248 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 
10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.656051 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.677976 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.692167 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc 
kubenswrapper[4698]: I0224 10:18:35.713204 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.731585 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: E0224 10:18:35.740752 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.749244 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.768715 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.786004 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.807402 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:21Z\\\",\\\"message\\\":\\\"/multus-additional-cni-plugins-jlg97\\\\nI0224 10:18:21.771147 6785 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0224 10:18:21.771034 6785 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:21.771150 6785 
default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mgh7p\\\\nI0224 10:18:21.771169 6785 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF0224 10:18:21.771174 6785 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not y\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.826198 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.844205 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e2735c5-8b7a-424e-ba7f-8fe39da1e460\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be309f6fc6bdf6f229b4a6ee32621f1385e3addb1c6655f4ee94a9e0f07e7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29426bbee48683a1da8ffc61612543b337ccf61119a3617bcbbb475f75dac606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://532e8024f6ab49ca211330f56da50af0f46daf4569ad97723d35aa97076cde4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.857861 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.870651 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.882069 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad7
89a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.899746 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081
f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:35 crc kubenswrapper[4698]: I0224 10:18:35.913031 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:35Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:36 crc kubenswrapper[4698]: I0224 10:18:36.613815 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:36 crc kubenswrapper[4698]: I0224 10:18:36.613817 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:36 crc kubenswrapper[4698]: E0224 10:18:36.613986 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:36 crc kubenswrapper[4698]: E0224 10:18:36.614112 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:37 crc kubenswrapper[4698]: I0224 10:18:37.331003 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:37 crc kubenswrapper[4698]: E0224 10:18:37.331212 4698 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:37 crc kubenswrapper[4698]: E0224 10:18:37.331361 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs podName:17a1338b-6385-4795-9397-74316d6599d9 nodeName:}" failed. No retries permitted until 2026-02-24 10:19:09.331332508 +0000 UTC m=+174.444946789 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs") pod "network-metrics-daemon-rpnnm" (UID: "17a1338b-6385-4795-9397-74316d6599d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:18:37 crc kubenswrapper[4698]: I0224 10:18:37.617503 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:37 crc kubenswrapper[4698]: E0224 10:18:37.617698 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:37 crc kubenswrapper[4698]: I0224 10:18:37.617961 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:37 crc kubenswrapper[4698]: E0224 10:18:37.618130 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.391630 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.391689 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.391706 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.391729 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.391748 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:38Z","lastTransitionTime":"2026-02-24T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:38 crc kubenswrapper[4698]: E0224 10:18:38.412350 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:38Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.417897 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.417982 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.418012 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.418044 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.418066 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:38Z","lastTransitionTime":"2026-02-24T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:38 crc kubenswrapper[4698]: E0224 10:18:38.440325 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:38Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.445964 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.446059 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.446083 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.446118 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.446142 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:38Z","lastTransitionTime":"2026-02-24T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:38 crc kubenswrapper[4698]: E0224 10:18:38.468791 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:38Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.473950 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.474013 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.474031 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.474058 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.474075 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:38Z","lastTransitionTime":"2026-02-24T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:38 crc kubenswrapper[4698]: E0224 10:18:38.493789 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:38Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.499430 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.499515 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.499537 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.499564 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.499586 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:38Z","lastTransitionTime":"2026-02-24T10:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:38 crc kubenswrapper[4698]: E0224 10:18:38.518823 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:38Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:38 crc kubenswrapper[4698]: E0224 10:18:38.519113 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.614438 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:38 crc kubenswrapper[4698]: I0224 10:18:38.614474 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:38 crc kubenswrapper[4698]: E0224 10:18:38.614669 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:38 crc kubenswrapper[4698]: E0224 10:18:38.614783 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.367968 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mbk6_17dd9ce8-b1ca-4810-85fe-9775919eb4b5/kube-multus/0.log" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.368044 4698 generic.go:334] "Generic (PLEG): container finished" podID="17dd9ce8-b1ca-4810-85fe-9775919eb4b5" containerID="ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54" exitCode=1 Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.368114 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7mbk6" event={"ID":"17dd9ce8-b1ca-4810-85fe-9775919eb4b5","Type":"ContainerDied","Data":"ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54"} Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.369053 4698 scope.go:117] "RemoveContainer" containerID="ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.395440 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.422444 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.446175 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:21Z\\\",\\\"message\\\":\\\"/multus-additional-cni-plugins-jlg97\\\\nI0224 10:18:21.771147 6785 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0224 10:18:21.771034 6785 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:21.771150 6785 
default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mgh7p\\\\nI0224 10:18:21.771169 6785 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF0224 10:18:21.771174 6785 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not y\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.467138 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.482100 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e2735c5-8b7a-424e-ba7f-8fe39da1e460\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be309f6fc6bdf6f229b4a6ee32621f1385e3addb1c6655f4ee94a9e0f07e7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29426bbee48683a1da8ffc61612543b337ccf61119a3617bcbbb475f75dac606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://532e8024f6ab49ca211330f56da50af0f46daf4569ad97723d35aa97076cde4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.500331 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.515614 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.535549 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad7
89a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.553601 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081
f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.567568 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.582494 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.599213 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.614037 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.614105 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:39 crc kubenswrapper[4698]: E0224 10:18:39.614182 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:39 crc kubenswrapper[4698]: E0224 10:18:39.614240 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.621292 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:39Z\\\",\\\"message\\\":\\\"2026-02-24T10:17:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d\\\\n2026-02-24T10:17:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d to /host/opt/cni/bin/\\\\n2026-02-24T10:17:54Z [verbose] multus-daemon started\\\\n2026-02-24T10:17:54Z [verbose] Readiness Indicator file check\\\\n2026-02-24T10:18:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.634488 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc 
kubenswrapper[4698]: I0224 10:18:39.650874 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.664258 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:39 crc kubenswrapper[4698]: I0224 10:18:39.673563 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:39Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.372646 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mbk6_17dd9ce8-b1ca-4810-85fe-9775919eb4b5/kube-multus/0.log" 
Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.372711 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7mbk6" event={"ID":"17dd9ce8-b1ca-4810-85fe-9775919eb4b5","Type":"ContainerStarted","Data":"26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7"} Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.390196 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:
46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.410234 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.432435 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.449378 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.470006 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:39Z\\\",\\\"message\\\":\\\"2026-02-24T10:17:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d\\\\n2026-02-24T10:17:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d to /host/opt/cni/bin/\\\\n2026-02-24T10:17:54Z [verbose] multus-daemon started\\\\n2026-02-24T10:17:54Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T10:18:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.484642 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.502942 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.518920 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.535954 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.555615 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.573773 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.603741 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:21Z\\\",\\\"message\\\":\\\"/multus-additional-cni-plugins-jlg97\\\\nI0224 10:18:21.771147 6785 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0224 10:18:21.771034 6785 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:21.771150 6785 
default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mgh7p\\\\nI0224 10:18:21.771169 6785 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF0224 10:18:21.771174 6785 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not y\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.614621 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.614636 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:40 crc kubenswrapper[4698]: E0224 10:18:40.614835 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:40 crc kubenswrapper[4698]: E0224 10:18:40.615005 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.618995 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.636673 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e2735c5-8b7a-424e-ba7f-8fe39da1e460\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be309f6fc6bdf6f229b4a6ee32621f1385e3addb1c6655f4ee94a9e0f07e7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29426bbee48683a1da8ffc61612543b337ccf61119a3617bcbbb475f75dac606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://532e8024f6ab49ca211330f56da50af0f46daf4569ad97723d35aa97076cde4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.654133 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad7
89a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.675454 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081
f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: I0224 10:18:40.690251 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:40Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:40 crc kubenswrapper[4698]: E0224 10:18:40.742644 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:18:41 crc kubenswrapper[4698]: I0224 10:18:41.614566 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:41 crc kubenswrapper[4698]: I0224 10:18:41.614690 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:41 crc kubenswrapper[4698]: E0224 10:18:41.614763 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:41 crc kubenswrapper[4698]: E0224 10:18:41.614864 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:42 crc kubenswrapper[4698]: I0224 10:18:42.614654 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:42 crc kubenswrapper[4698]: I0224 10:18:42.614681 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:42 crc kubenswrapper[4698]: E0224 10:18:42.614839 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:42 crc kubenswrapper[4698]: E0224 10:18:42.614987 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:43 crc kubenswrapper[4698]: I0224 10:18:43.614474 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:43 crc kubenswrapper[4698]: I0224 10:18:43.614558 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:43 crc kubenswrapper[4698]: E0224 10:18:43.614666 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:43 crc kubenswrapper[4698]: E0224 10:18:43.614812 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:44 crc kubenswrapper[4698]: I0224 10:18:44.614357 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:44 crc kubenswrapper[4698]: I0224 10:18:44.614362 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:44 crc kubenswrapper[4698]: E0224 10:18:44.614552 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:44 crc kubenswrapper[4698]: E0224 10:18:44.614672 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.614343 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:45 crc kubenswrapper[4698]: E0224 10:18:45.614562 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.614653 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:45 crc kubenswrapper[4698]: E0224 10:18:45.614905 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.637961 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop 
'(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.657608 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.678545 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.702452 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.723767 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: E0224 10:18:45.743535 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.743589 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:39Z\\\",\\\"message\\\":\\\"2026-02-24T10:17:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d\\\\n2026-02-24T10:17:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d to /host/opt/cni/bin/\\\\n2026-02-24T10:17:54Z [verbose] multus-daemon started\\\\n2026-02-24T10:17:54Z [verbose] Readiness Indicator file check\\\\n2026-02-24T10:18:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.765934 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc 
kubenswrapper[4698]: I0224 10:18:45.786243 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e2735c5-8b7a-424e-ba7f-8fe39da1e460\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be309f6fc6bdf6f229b4a6ee32621f1385e3addb1c6655f4ee94a9e0f07e7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29426bbee48683a1da8ffc61612543b337ccf61119a3617bcbbb475f75dac606\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://532e8024f6ab49ca211330f56da50af0f46daf4569ad97723d35aa97076cde4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.801560 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.818965 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.830869 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.847780 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.868018 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:21Z\\\",\\\"message\\\":\\\"/multus-additional-cni-plugins-jlg97\\\\nI0224 10:18:21.771147 6785 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0224 10:18:21.771034 6785 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:21.771150 6785 
default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mgh7p\\\\nI0224 10:18:21.771169 6785 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF0224 10:18:21.771174 6785 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not y\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.881048 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.894693 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad789a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.911575 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 
10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba
5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:45 crc kubenswrapper[4698]: I0224 10:18:45.926405 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:45Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:46 crc kubenswrapper[4698]: I0224 10:18:46.613719 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:46 crc kubenswrapper[4698]: I0224 10:18:46.613735 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:46 crc kubenswrapper[4698]: E0224 10:18:46.613983 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:46 crc kubenswrapper[4698]: E0224 10:18:46.614140 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:47 crc kubenswrapper[4698]: I0224 10:18:47.613852 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:47 crc kubenswrapper[4698]: I0224 10:18:47.613855 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:47 crc kubenswrapper[4698]: E0224 10:18:47.614648 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:47 crc kubenswrapper[4698]: E0224 10:18:47.614948 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.610857 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.610927 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.610946 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.610973 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.610992 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:48Z","lastTransitionTime":"2026-02-24T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.613706 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:48 crc kubenswrapper[4698]: E0224 10:18:48.613891 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.614056 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:48 crc kubenswrapper[4698]: E0224 10:18:48.614182 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.615997 4698 scope.go:117] "RemoveContainer" containerID="0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c" Feb 24 10:18:48 crc kubenswrapper[4698]: E0224 10:18:48.635143 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:48Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.642541 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.642607 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.642625 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.642654 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.642674 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:48Z","lastTransitionTime":"2026-02-24T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.646651 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 24 10:18:48 crc kubenswrapper[4698]: E0224 10:18:48.666397 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:48Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.673408 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.673472 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.673499 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.673528 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.673550 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:48Z","lastTransitionTime":"2026-02-24T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:48 crc kubenswrapper[4698]: E0224 10:18:48.693919 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:48Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.700656 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.700749 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.700767 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.700791 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.700807 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:48Z","lastTransitionTime":"2026-02-24T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:48 crc kubenswrapper[4698]: E0224 10:18:48.723482 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:48Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.730020 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.730079 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.730097 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.730153 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:48 crc kubenswrapper[4698]: I0224 10:18:48.730171 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:48Z","lastTransitionTime":"2026-02-24T10:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:48 crc kubenswrapper[4698]: E0224 10:18:48.753539 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:48Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:48 crc kubenswrapper[4698]: E0224 10:18:48.753932 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.408377 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/2.log" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.411853 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"} Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.428134 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.440418 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.454360 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.467429 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.488627 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:39Z\\\",\\\"message\\\":\\\"2026-02-24T10:17:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d\\\\n2026-02-24T10:17:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d to /host/opt/cni/bin/\\\\n2026-02-24T10:17:54Z [verbose] multus-daemon started\\\\n2026-02-24T10:17:54Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T10:18:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.505742 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.530795 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c80bb7-e413-45b4-8845-2b54d65b6529\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ce842ec12984cffb63c49d9c2964440e503b1225036922d25e238b978b26130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a06990c16f9a0312f24771d4bfbbedeebbf5063afb8daaccfc4d17f60d641f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1349d6c8aff311d876b61e13793a708952def1ba52ba669fcf8a99b27ba7db5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63acc4dd56ca511d6de2f69a1f60dc53
516cf4883c0355e1de373ae7fe0807f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d7559f437e1a17b2ab3498c72ef428df69dfcc6827f78dd1edbc4a8251b5f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.549081 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.558709 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.572516 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.584532 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.602638 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.614387 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.614518 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:49 crc kubenswrapper[4698]: E0224 10:18:49.615125 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:49 crc kubenswrapper[4698]: E0224 10:18:49.615022 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.623853 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:21Z\\\",\\\"message\\\":\\\"/multus-additional-cni-plugins-jlg97\\\\nI0224 10:18:21.771147 6785 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0224 10:18:21.771034 6785 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:21.771150 6785 
default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mgh7p\\\\nI0224 10:18:21.771169 6785 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF0224 10:18:21.771174 6785 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not 
y\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.634345 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.647657 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e2735c5-8b7a-424e-ba7f-8fe39da1e460\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be309f6fc6bdf6f229b4a6ee32621f1385e3addb1c6655f4ee94a9e0f07e7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29426bbee48683a1da8ffc61612543b337ccf61119a3617bcbbb475f75dac606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://532e8024f6ab49ca211330f56da50af0f46daf4569ad97723d35aa97076cde4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.662527 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad7
89a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.674377 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081
f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:49 crc kubenswrapper[4698]: I0224 10:18:49.683895 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:49Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.420048 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/3.log" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.421351 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/2.log" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.425626 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71" exitCode=1 Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.425712 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"} Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.425803 4698 scope.go:117] "RemoveContainer" containerID="0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.426870 4698 scope.go:117] "RemoveContainer" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71" Feb 24 10:18:50 crc kubenswrapper[4698]: E0224 10:18:50.427153 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.453639 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c80bb7-e413-45b4-8845-2b54d65b6529\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ce842ec12984cffb63c49d9c2964440e503b1225036922d25e238b978b26130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a06990c16f9a0312f24771d4bfbbedeebbf5063afb8daaccfc4d17f60d641f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1349d6c8aff311d876b61e13793a708952def1ba52ba669fcf8a99b27ba7db5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://c63acc4dd56ca511d6de2f69a1f60dc53516cf4883c0355e1de373ae7fe0807f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d7559f437e1a17b2ab3498c72ef428df69dfcc6827f78dd1edbc4a8251b5f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.469108 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.488359 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.503392 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.520127 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:39Z\\\",\\\"message\\\":\\\"2026-02-24T10:17:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d\\\\n2026-02-24T10:17:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d to /host/opt/cni/bin/\\\\n2026-02-24T10:17:54Z [verbose] multus-daemon started\\\\n2026-02-24T10:17:54Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T10:18:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.532711 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.545971 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17
:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.565900 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e2735c5-8b7a-424e-ba7f-8fe39da1e460\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be309f6fc6bdf6f229b4a6ee32621f1385e3addb1c6655f4ee94a9e0f07e7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29426bbee48683a1da8ffc61612543b337ccf61119a3617bcbbb475f75dac606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://532e8024f6ab49ca211330f56da50af0f46daf4569ad97723d35aa97076cde4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.582093 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.595294 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.607283 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.614149 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.614152 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:50 crc kubenswrapper[4698]: E0224 10:18:50.614255 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:50 crc kubenswrapper[4698]: E0224 10:18:50.614389 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.621948 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",
\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.642768 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0fd1af0c59642907aa55721d60c59e0870d5597da7bdf99f8248f852ea5e393c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:21Z\\\",\\\"message\\\":\\\"/multus-additional-cni-plugins-jlg97\\\\nI0224 10:18:21.771147 6785 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0224 10:18:21.771034 6785 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:21.771150 6785 
default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-mgh7p\\\\nI0224 10:18:21.771169 6785 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nF0224 10:18:21.771174 6785 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not y\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:49Z\\\",\\\"message\\\":\\\"oing to retry *v1.Pod resource setup for 1 objects: [openshift-multus/network-metrics-daemon-rpnnm]\\\\nI0224 10:18:49.620874 7113 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:49.620813 7113 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0224 10:18:49.620920 7113 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:18:49.620917 7113 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-rpnnm 
before timer (time: 2026-02-24 10:18:50.983596124 +0000 UTC m=+2.005448810): skip\\\\nI0224 10:18:49.620957 7113 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0224 10:18:49.620960 7113 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 99.992µs)\\\\nI0224 10:18:49.621052 7113 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 10:18:49.621117 7113 factory.go:656] Stopping watch factory\\\\nI0224 10:18:49.621156 7113 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 10:18:49.621172 7113 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:49.621195 7113 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:49.621323 7113 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-ne
td\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.656542 4698 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad789a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.677731 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081
f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.693377 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.714468 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: I0224 10:18:50.734209 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:50Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:50 crc kubenswrapper[4698]: E0224 10:18:50.745180 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:18:51 crc kubenswrapper[4698]: I0224 10:18:51.431958 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/3.log" Feb 24 10:18:51 crc kubenswrapper[4698]: I0224 10:18:51.614077 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:51 crc kubenswrapper[4698]: I0224 10:18:51.614132 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:51 crc kubenswrapper[4698]: E0224 10:18:51.614219 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:51 crc kubenswrapper[4698]: E0224 10:18:51.614395 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.574460 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.575884 4698 scope.go:117] "RemoveContainer" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71" Feb 24 10:18:52 crc kubenswrapper[4698]: E0224 10:18:52.576304 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.600946 4698 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.614165 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.614174 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:52 crc kubenswrapper[4698]: E0224 10:18:52.614336 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:52 crc kubenswrapper[4698]: E0224 10:18:52.614508 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.667094 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.688773 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.705187 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.719255 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:39Z\\\",\\\"message\\\":\\\"2026-02-24T10:17:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d\\\\n2026-02-24T10:17:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d to /host/opt/cni/bin/\\\\n2026-02-24T10:17:54Z [verbose] multus-daemon started\\\\n2026-02-24T10:17:54Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T10:18:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.734089 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.764968 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c80bb7-e413-45b4-8845-2b54d65b6529\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ce842ec12984cffb63c49d9c2964440e503b1225036922d25e238b978b26130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a06990c16f9a0312f24771d4bfbbedeebbf5063afb8daaccfc4d17f60d641f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1349d6c8aff311d876b61e13793a708952def1ba52ba669fcf8a99b27ba7db5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63acc4dd56ca511d6de2f69a1f60dc53
516cf4883c0355e1de373ae7fe0807f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d7559f437e1a17b2ab3498c72ef428df69dfcc6827f78dd1edbc4a8251b5f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.785507 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.801974 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.818697 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.836714 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.857333 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.885053 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:49Z\\\",\\\"message\\\":\\\"oing to retry *v1.Pod resource setup for 1 objects: [openshift-multus/network-metrics-daemon-rpnnm]\\\\nI0224 10:18:49.620874 7113 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:49.620813 7113 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0224 10:18:49.620920 
7113 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:18:49.620917 7113 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-rpnnm before timer (time: 2026-02-24 10:18:50.983596124 +0000 UTC m=+2.005448810): skip\\\\nI0224 10:18:49.620957 7113 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0224 10:18:49.620960 7113 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 99.992µs)\\\\nI0224 10:18:49.621052 7113 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 10:18:49.621117 7113 factory.go:656] Stopping watch factory\\\\nI0224 10:18:49.621156 7113 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 10:18:49.621172 7113 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:49.621195 7113 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:49.621323 7113 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.900901 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.918745 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e2735c5-8b7a-424e-ba7f-8fe39da1e460\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be309f6fc6bdf6f229b4a6ee32621f1385e3addb1c6655f4ee94a9e0f07e7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29426bbee48683a1da8ffc61612543b337ccf61119a3617bcbbb475f75dac606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://532e8024f6ab49ca211330f56da50af0f46daf4569ad97723d35aa97076cde4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.933666 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad7
89a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.954231 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081
f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:52 crc kubenswrapper[4698]: I0224 10:18:52.969710 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:52Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:53 crc kubenswrapper[4698]: I0224 10:18:53.614565 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:53 crc kubenswrapper[4698]: I0224 10:18:53.614589 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:53 crc kubenswrapper[4698]: E0224 10:18:53.614802 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:53 crc kubenswrapper[4698]: E0224 10:18:53.614886 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:54 crc kubenswrapper[4698]: I0224 10:18:54.613750 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:54 crc kubenswrapper[4698]: I0224 10:18:54.613790 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:54 crc kubenswrapper[4698]: E0224 10:18:54.613970 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:54 crc kubenswrapper[4698]: E0224 10:18:54.614084 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.529532 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.529740 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.529811 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.529901 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:19:59.529852006 +0000 UTC m=+224.643466287 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.529944 4698 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.530043 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-24 10:19:59.530014319 +0000 UTC m=+224.643628630 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.530046 4698 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.530131 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-24 10:19:59.530112182 +0000 UTC m=+224.643726453 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.614633 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.614825 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.614984 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.615147 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.630514 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.630589 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.630721 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.630738 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.630753 4698 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.630810 4698 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-24 10:19:59.630792892 +0000 UTC m=+224.744407143 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.631073 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.631168 4698 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.631239 4698 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.631438 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-24 10:19:59.631370547 +0000 UTC m=+224.744984828 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.632239 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://686acda68f64175c520ef
c4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.645600 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.658239 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.679860 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c80bb7-e413-45b4-8845-2b54d65b6529\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ce842ec12984cffb63c49d9c2964440e503b1225036922d25e238b978b26130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a06990c16f9a0312f24771d4bfbbedeebbf5063afb8daaccfc4d17f60d641f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1349d6c8aff311d876b61e13793a708952def1ba52ba669fcf8a99b27ba7db5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63acc4dd56ca511d6de2f69a1f60dc53
516cf4883c0355e1de373ae7fe0807f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d7559f437e1a17b2ab3498c72ef428df69dfcc6827f78dd1edbc4a8251b5f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.692236 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.708057 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.720050 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.736002 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:39Z\\\",\\\"message\\\":\\\"2026-02-24T10:17:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d\\\\n2026-02-24T10:17:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d to /host/opt/cni/bin/\\\\n2026-02-24T10:17:54Z [verbose] multus-daemon started\\\\n2026-02-24T10:17:54Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T10:18:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: E0224 10:18:55.745593 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.758940 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c
4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.788374 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:49Z\\\",\\\"message\\\":\\\"oing to retry *v1.Pod resource setup for 1 objects: [openshift-multus/network-metrics-daemon-rpnnm]\\\\nI0224 10:18:49.620874 7113 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:49.620813 7113 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0224 10:18:49.620920 
7113 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:18:49.620917 7113 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-rpnnm before timer (time: 2026-02-24 10:18:50.983596124 +0000 UTC m=+2.005448810): skip\\\\nI0224 10:18:49.620957 7113 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0224 10:18:49.620960 7113 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 99.992µs)\\\\nI0224 10:18:49.621052 7113 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 10:18:49.621117 7113 factory.go:656] Stopping watch factory\\\\nI0224 10:18:49.621156 7113 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 10:18:49.621172 7113 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:49.621195 7113 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:49.621323 7113 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.801279 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.816925 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e2735c5-8b7a-424e-ba7f-8fe39da1e460\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be309f6fc6bdf6f229b4a6ee32621f1385e3addb1c6655f4ee94a9e0f07e7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29426bbee48683a1da8ffc61612543b337ccf61119a3617bcbbb475f75dac606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://532e8024f6ab49ca211330f56da50af0f46daf4569ad97723d35aa97076cde4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.834367 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.852862 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.872010 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.889449 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad7
89a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.908925 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081
f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:55 crc kubenswrapper[4698]: I0224 10:18:55.920181 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:55Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:56 crc kubenswrapper[4698]: I0224 10:18:56.614642 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:56 crc kubenswrapper[4698]: E0224 10:18:56.615089 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:56 crc kubenswrapper[4698]: I0224 10:18:56.614719 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:56 crc kubenswrapper[4698]: E0224 10:18:56.615376 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:57 crc kubenswrapper[4698]: I0224 10:18:57.614670 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:57 crc kubenswrapper[4698]: I0224 10:18:57.614764 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:57 crc kubenswrapper[4698]: E0224 10:18:57.614950 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:57 crc kubenswrapper[4698]: E0224 10:18:57.615086 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:18:58 crc kubenswrapper[4698]: I0224 10:18:58.614754 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:18:58 crc kubenswrapper[4698]: I0224 10:18:58.614762 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:18:58 crc kubenswrapper[4698]: E0224 10:18:58.614974 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:18:58 crc kubenswrapper[4698]: E0224 10:18:58.615101 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.000725 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.000774 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.000789 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.000809 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.000825 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:59Z","lastTransitionTime":"2026-02-24T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:59 crc kubenswrapper[4698]: E0224 10:18:59.022391 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.027234 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.027331 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.027350 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.027376 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.027393 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:59Z","lastTransitionTime":"2026-02-24T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:59 crc kubenswrapper[4698]: E0224 10:18:59.047343 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.052385 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.052432 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.052443 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.052459 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.052472 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:59Z","lastTransitionTime":"2026-02-24T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:59 crc kubenswrapper[4698]: E0224 10:18:59.070971 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.074790 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.074833 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.074846 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.074869 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.074882 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:59Z","lastTransitionTime":"2026-02-24T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:59 crc kubenswrapper[4698]: E0224 10:18:59.087166 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.091680 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.091770 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.091781 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.091796 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.091805 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:18:59Z","lastTransitionTime":"2026-02-24T10:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:18:59 crc kubenswrapper[4698]: E0224 10:18:59.109960 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:18:59Z is after 2025-08-24T17:21:41Z" Feb 24 10:18:59 crc kubenswrapper[4698]: E0224 10:18:59.110073 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.614085 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:18:59 crc kubenswrapper[4698]: E0224 10:18:59.614395 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:18:59 crc kubenswrapper[4698]: I0224 10:18:59.614083 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:18:59 crc kubenswrapper[4698]: E0224 10:18:59.614948 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:00 crc kubenswrapper[4698]: I0224 10:19:00.614156 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:00 crc kubenswrapper[4698]: I0224 10:19:00.614198 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:00 crc kubenswrapper[4698]: E0224 10:19:00.614346 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:00 crc kubenswrapper[4698]: E0224 10:19:00.614495 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:00 crc kubenswrapper[4698]: E0224 10:19:00.747455 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:19:01 crc kubenswrapper[4698]: I0224 10:19:01.615504 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:01 crc kubenswrapper[4698]: I0224 10:19:01.615597 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:01 crc kubenswrapper[4698]: E0224 10:19:01.615748 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:01 crc kubenswrapper[4698]: E0224 10:19:01.615871 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:02 crc kubenswrapper[4698]: I0224 10:19:02.614683 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:02 crc kubenswrapper[4698]: I0224 10:19:02.614746 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:02 crc kubenswrapper[4698]: E0224 10:19:02.614882 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:02 crc kubenswrapper[4698]: E0224 10:19:02.615160 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:03 crc kubenswrapper[4698]: I0224 10:19:03.614067 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:03 crc kubenswrapper[4698]: I0224 10:19:03.614078 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:03 crc kubenswrapper[4698]: E0224 10:19:03.614249 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:03 crc kubenswrapper[4698]: E0224 10:19:03.614336 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:04 crc kubenswrapper[4698]: I0224 10:19:04.614066 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:04 crc kubenswrapper[4698]: E0224 10:19:04.614245 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:04 crc kubenswrapper[4698]: I0224 10:19:04.614316 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:04 crc kubenswrapper[4698]: E0224 10:19:04.615065 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:04 crc kubenswrapper[4698]: I0224 10:19:04.615609 4698 scope.go:117] "RemoveContainer" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71" Feb 24 10:19:04 crc kubenswrapper[4698]: E0224 10:19:04.615901 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.614082 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:05 crc kubenswrapper[4698]: E0224 10:19:05.614385 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.614517 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:05 crc kubenswrapper[4698]: E0224 10:19:05.614744 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.628055 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.636320 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"34fd32d5-5aed-4abb-bf14-ab1b8b02b516\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc\\\",\\\"image\\\":\
\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:17:08Z\\\",\\\"message\\\":\\\"le observer\\\\nW0224 10:17:08.346350 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0224 10:17:08.346447 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0224 10:17:08.346900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705878618/tls.crt::/tmp/serving-cert-3705878618/tls.key\\\\\\\"\\\\nI0224 10:17:08.624012 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0224 10:17:08.625525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0224 10:17:08.625540 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0224 10:17:08.625560 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0224 10:17:08.625565 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0224 10:17:08.629654 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0224 10:17:08.629666 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0224 10:17:08.629711 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 
10:17:08.629725 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0224 10:17:08.629739 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0224 10:17:08.629749 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0224 10:17:08.629758 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0224 10:17:08.629766 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0224 10:17:08.630467 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:07Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba
5628b676356f34b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.652885 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-29rvz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9cba56db-d46e-4a34-9863-47e4dce27ca5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f62a06c2933f02c75637172be87adadd015a2aad2750f553bb2e99c38fbec74b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fk9xv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-29rvz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.673895 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ff15454f-f3f9-4740-ba7f-141fc467f2bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23d201f106cf9fbd3bd2821755ea1fd87709b24155eebfab4f687defd0fd60bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://686acda68f64175c520efc4054df6bcfd32b2c98a3d8134d32e252d265520338\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-24T10:16:46Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0224 10:16:17.940332 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0224 10:16:17.942929 1 observer_polling.go:159] Starting file observer\\\\nI0224 10:16:18.009451 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0224 10:16:18.013067 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0224 10:16:46.691402 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0224 10:16:46.691560 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ee3d8391b55fa37cff72ad555ec89f4b12b8b5ef765979d929da0ae7cbb052\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b45bc6035a33d5e9841bd5791aeb2521dd1f93616396be15bef77dc6f5af97cc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.693902 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: E0224 10:19:05.748012 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.769392 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e9c80bb7-e413-45b4-8845-2b54d65b6529\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ce842ec12984cffb63c49d9c2964440e503b1225036922d25e238b978b26130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a06990c16f9a0312f24771d4bfbbedeebbf5063afb8daaccfc4d17f60d641f5f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1349d6c8aff311d876b61e13793a708952def1ba52ba669fcf8a99b27ba7db5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c63acc4dd56ca511d6de2f69a1f60dc53516cf4883c0355e1de373ae7fe0807f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://69d7559f437e1a17b2ab3498c72ef428df69dfcc6827f78dd1edbc4a8251b5f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96955e70c81698ab59580428c999d2bc6a50b712c569961169488e58f1702878\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ac6f358ab1233bc8d572a403f57ad949ec6e10df4c56b9d4b535362a0f639e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6cba8547d85b5fb437d90d455b42654df8d8663b592dee40a0982427c2f98547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.788310 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.804039 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4539d49e9935099b59be97e672ffbe6a2a831b9261939a5afba45e16aab5c2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.817915 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4ee0bb1-125d-4852-a54d-7dadf6177545\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e67e08c23594b195088f0a11823556880d9f809097ec231acf6c4ddbcf5c085b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9ngd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-nn578\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.835520 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7mbk6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"17dd9ce8-b1ca-4810-85fe-9775919eb4b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:39Z\\\",\\\"message\\\":\\\"2026-02-24T10:17:53+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d\\\\n2026-02-24T10:17:53+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_04a40885-9d91-49e5-a993-91473cb3b04d to /host/opt/cni/bin/\\\\n2026-02-24T10:17:54Z [verbose] multus-daemon started\\\\n2026-02-24T10:17:54Z [verbose] 
Readiness Indicator file check\\\\n2026-02-24T10:18:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tgnjd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7mbk6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.849853 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"17a1338b-6385-4795-9397-74316d6599d9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t7xll\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:05Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-rpnnm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.866118 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-mb4d7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc3c474c-e869-4b47-94c5-f1ab3ce3c843\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d49238acba0219497644e528a1e99906b8e7e5d4a61033354fa8b7b9708b5e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17
:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2d8kb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:58Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-mb4d7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.881391 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e2735c5-8b7a-424e-ba7f-8fe39da1e460\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:16:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2be309f6fc6bdf6f229b4a6ee32621f1385e3addb1c6655f4ee94a9e0f07e7e7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29426bbee48683a1da8ffc61612543b337ccf61119a3617bcbbb475f75dac606\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://532e8024f6ab49ca211330f56da50af0f46daf4569ad97723d35aa97076cde4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:16:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cadab6d7d113b12b60104344c27a04acf451f6627c3d62ad17b9132d63b6e974\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:16:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:16:16Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:16:15Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.896977 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:56Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b70223850a461f607af8055fb157db676ed4dd9537481c41f21b8b85dc955c6f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.913750 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e70623bb6b1c9ba54ae662592cd2861cea4181853f6595a595390c81820c287\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://863cee3a2b2acf3
e3138d4e13d27a2b4229d619661f97eab920e5a4ee7ae2c51\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.931824 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:51Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.952202 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jlg97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90062989-bf1b-4479-89a0-f3bf0d438ac3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dd966a1dd77be4accb00f38133ee9df9a0f98df5050d51996c9547a95c361cfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f570e20898252544de2e4987e3ec3baea2d46904749fc01664c969518d8babd6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86844171c4cdeecffa4831f9bba9b6d9c5eecbcc2220f880ccdb8819df60fa34\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://42705a048e7832b1de855a97691620e572a7a7f38b90148e1cedd49003c649fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5968e
3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5968e3b94b9d8996e9c4d4fdfab0576fcee049356dff5defd85f1a71ab652c41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://05bd18aaa2469fc7380f98a513907e098a1cd45c794dae35894dc4caccaaeac8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c47b55214c6082bb9f8a18705983f9be95ef4c3b557d2d8f6cb8a33fa1fddd2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxqkd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:51Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jlg97\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.976292 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"066df704-6981-4770-a647-df52a0da50a0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:54Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-24T10:18:49Z\\\",\\\"message\\\":\\\"oing to retry *v1.Pod resource setup for 1 objects: [openshift-multus/network-metrics-daemon-rpnnm]\\\\nI0224 10:18:49.620874 7113 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0224 10:18:49.620813 7113 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0224 10:18:49.620920 
7113 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0224 10:18:49.620917 7113 obj_retry.go:285] Attempting retry of *v1.Pod openshift-multus/network-metrics-daemon-rpnnm before timer (time: 2026-02-24 10:18:50.983596124 +0000 UTC m=+2.005448810): skip\\\\nI0224 10:18:49.620957 7113 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0224 10:18:49.620960 7113 obj_retry.go:420] Function iterateRetryResources for *v1.Pod ended (in 99.992µs)\\\\nI0224 10:18:49.621052 7113 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0224 10:18:49.621117 7113 factory.go:656] Stopping watch factory\\\\nI0224 10:18:49.621156 7113 handler.go:208] Removed *v1.Node event handler 2\\\\nI0224 10:18:49.621172 7113 ovnkube.go:599] Stopped ovnkube\\\\nI0224 10:18:49.621195 7113 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0224 10:18:49.621323 7113 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-24T10:18:48Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://363eade2263b2108fe
aaf0620f7f1fd910effb90ce635e5b749b59b407618443\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-24T10:17:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-24T10:17:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-l2k2d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:17:52Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-mgh7p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:05 crc kubenswrapper[4698]: I0224 10:19:05.990724 4698 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6d24b42-65c5-4a01-8f4a-6f970714ab76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-24T10:18:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93b33a3866385dfb6006f052ecde4b52df1dad342d6392f0935f548b610c26e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://892da9f80566a48c6ace1fb4d7a16d824aad7
89a4ae631728a01c22a8d7b04f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-24T10:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-knwn8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-24T10:18:04Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-bhrhk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:05Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:06 crc kubenswrapper[4698]: I0224 10:19:06.614085 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:06 crc kubenswrapper[4698]: I0224 10:19:06.614175 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:06 crc kubenswrapper[4698]: E0224 10:19:06.614324 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:06 crc kubenswrapper[4698]: E0224 10:19:06.614380 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:07 crc kubenswrapper[4698]: I0224 10:19:07.614713 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:07 crc kubenswrapper[4698]: I0224 10:19:07.614775 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:07 crc kubenswrapper[4698]: E0224 10:19:07.614876 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:07 crc kubenswrapper[4698]: E0224 10:19:07.614999 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:08 crc kubenswrapper[4698]: I0224 10:19:08.613987 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:08 crc kubenswrapper[4698]: I0224 10:19:08.614041 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:08 crc kubenswrapper[4698]: E0224 10:19:08.614234 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:08 crc kubenswrapper[4698]: E0224 10:19:08.614521 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.247225 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.247286 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.247331 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.247355 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.247372 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:19:09Z","lastTransitionTime":"2026-02-24T10:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:19:09 crc kubenswrapper[4698]: E0224 10:19:09.269021 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:09Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.274650 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.274991 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.275239 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.275521 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.275777 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:19:09Z","lastTransitionTime":"2026-02-24T10:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:19:09 crc kubenswrapper[4698]: E0224 10:19:09.296641 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:09Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.303211 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.303256 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.303306 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.303324 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.303338 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:19:09Z","lastTransitionTime":"2026-02-24T10:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:19:09 crc kubenswrapper[4698]: E0224 10:19:09.323583 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:09Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.328120 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.328160 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.328171 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.328190 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.328203 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:19:09Z","lastTransitionTime":"2026-02-24T10:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:19:09 crc kubenswrapper[4698]: E0224 10:19:09.347567 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:09Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.352272 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.352340 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.352357 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.352377 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.352392 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:19:09Z","lastTransitionTime":"2026-02-24T10:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:19:09 crc kubenswrapper[4698]: E0224 10:19:09.371288 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-24T10:19:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5b118f46-32f0-479c-9931-37b2bbb76922\\\",\\\"systemUUID\\\":\\\"b9d2441b-c8c3-476a-9c48-bba682d9b98e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-24T10:19:09Z is after 2025-08-24T17:21:41Z" Feb 24 10:19:09 crc kubenswrapper[4698]: E0224 10:19:09.371489 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.375898 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:09 crc kubenswrapper[4698]: E0224 10:19:09.376079 4698 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:19:09 crc kubenswrapper[4698]: E0224 10:19:09.376189 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs podName:17a1338b-6385-4795-9397-74316d6599d9 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:13.376163983 +0000 UTC m=+238.489778254 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs") pod "network-metrics-daemon-rpnnm" (UID: "17a1338b-6385-4795-9397-74316d6599d9") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.614589 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:09 crc kubenswrapper[4698]: I0224 10:19:09.614583 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:09 crc kubenswrapper[4698]: E0224 10:19:09.614699 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:09 crc kubenswrapper[4698]: E0224 10:19:09.614882 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:10 crc kubenswrapper[4698]: I0224 10:19:10.614601 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:10 crc kubenswrapper[4698]: I0224 10:19:10.614640 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:10 crc kubenswrapper[4698]: E0224 10:19:10.614888 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:10 crc kubenswrapper[4698]: E0224 10:19:10.615032 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:10 crc kubenswrapper[4698]: E0224 10:19:10.749960 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:19:11 crc kubenswrapper[4698]: I0224 10:19:11.614112 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:11 crc kubenswrapper[4698]: I0224 10:19:11.614153 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:11 crc kubenswrapper[4698]: E0224 10:19:11.614333 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:11 crc kubenswrapper[4698]: E0224 10:19:11.614531 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:12 crc kubenswrapper[4698]: I0224 10:19:12.614348 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:12 crc kubenswrapper[4698]: E0224 10:19:12.614543 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:12 crc kubenswrapper[4698]: I0224 10:19:12.614849 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:12 crc kubenswrapper[4698]: E0224 10:19:12.615700 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:13 crc kubenswrapper[4698]: I0224 10:19:13.614142 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:13 crc kubenswrapper[4698]: I0224 10:19:13.614163 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:13 crc kubenswrapper[4698]: E0224 10:19:13.614387 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:13 crc kubenswrapper[4698]: E0224 10:19:13.615548 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:14 crc kubenswrapper[4698]: I0224 10:19:14.614010 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:14 crc kubenswrapper[4698]: I0224 10:19:14.614661 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:14 crc kubenswrapper[4698]: E0224 10:19:14.614851 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:14 crc kubenswrapper[4698]: E0224 10:19:14.614969 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.613993 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:15 crc kubenswrapper[4698]: E0224 10:19:15.614180 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.614369 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:15 crc kubenswrapper[4698]: E0224 10:19:15.614503 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.617115 4698 scope.go:117] "RemoveContainer" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71" Feb 24 10:19:15 crc kubenswrapper[4698]: E0224 10:19:15.617648 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.653704 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=59.653462003 podStartE2EDuration="59.653462003s" podCreationTimestamp="2026-02-24 10:18:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:15.652140371 +0000 UTC m=+180.765754642" watchObservedRunningTime="2026-02-24 10:19:15.653462003 +0000 UTC m=+180.767076294" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.728191 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" 
podStartSLOduration=27.728171006 podStartE2EDuration="27.728171006s" podCreationTimestamp="2026-02-24 10:18:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:15.72712719 +0000 UTC m=+180.840741441" watchObservedRunningTime="2026-02-24 10:19:15.728171006 +0000 UTC m=+180.841785257" Feb 24 10:19:15 crc kubenswrapper[4698]: E0224 10:19:15.751022 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.801177 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podStartSLOduration=137.801145977 podStartE2EDuration="2m17.801145977s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:15.79668693 +0000 UTC m=+180.910301231" watchObservedRunningTime="2026-02-24 10:19:15.801145977 +0000 UTC m=+180.914760268" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.831123 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7mbk6" podStartSLOduration=137.831108181 podStartE2EDuration="2m17.831108181s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:15.818289901 +0000 UTC m=+180.931904162" watchObservedRunningTime="2026-02-24 10:19:15.831108181 +0000 UTC m=+180.944722432" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.858562 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-mb4d7" podStartSLOduration=137.858542252 podStartE2EDuration="2m17.858542252s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:15.842640099 +0000 UTC m=+180.956254370" watchObservedRunningTime="2026-02-24 10:19:15.858542252 +0000 UTC m=+180.972156503" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.872248 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.872229233 podStartE2EDuration="43.872229233s" podCreationTimestamp="2026-02-24 10:18:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:15.858515032 +0000 UTC m=+180.972129313" watchObservedRunningTime="2026-02-24 10:19:15.872229233 +0000 UTC m=+180.985843484" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.872859 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=10.872852308 podStartE2EDuration="10.872852308s" podCreationTimestamp="2026-02-24 10:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:15.871822454 +0000 UTC m=+180.985436715" watchObservedRunningTime="2026-02-24 10:19:15.872852308 +0000 UTC m=+180.986466559" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.936384 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jlg97" podStartSLOduration=137.936369611 podStartE2EDuration="2m17.936369611s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:15.935810397 +0000 UTC m=+181.049424638" watchObservedRunningTime="2026-02-24 10:19:15.936369611 +0000 UTC m=+181.049983852" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.971273 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-bhrhk" podStartSLOduration=136.971260784 podStartE2EDuration="2m16.971260784s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:15.969973853 +0000 UTC m=+181.083588094" watchObservedRunningTime="2026-02-24 10:19:15.971260784 +0000 UTC m=+181.084875025" Feb 24 10:19:15 crc kubenswrapper[4698]: I0224 10:19:15.990013 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=76.989997036 podStartE2EDuration="1m16.989997036s" podCreationTimestamp="2026-02-24 10:17:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:15.989351171 +0000 UTC m=+181.102965412" watchObservedRunningTime="2026-02-24 10:19:15.989997036 +0000 UTC m=+181.103611277" Feb 24 10:19:16 crc kubenswrapper[4698]: I0224 10:19:16.613965 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:16 crc kubenswrapper[4698]: I0224 10:19:16.614056 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:16 crc kubenswrapper[4698]: E0224 10:19:16.614143 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:16 crc kubenswrapper[4698]: E0224 10:19:16.614312 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:17 crc kubenswrapper[4698]: I0224 10:19:17.614338 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:17 crc kubenswrapper[4698]: I0224 10:19:17.614358 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:17 crc kubenswrapper[4698]: E0224 10:19:17.614618 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:17 crc kubenswrapper[4698]: E0224 10:19:17.614754 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:18 crc kubenswrapper[4698]: I0224 10:19:18.614521 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:18 crc kubenswrapper[4698]: I0224 10:19:18.614593 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:18 crc kubenswrapper[4698]: E0224 10:19:18.614721 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:18 crc kubenswrapper[4698]: E0224 10:19:18.614866 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.614926 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.614941 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:19 crc kubenswrapper[4698]: E0224 10:19:19.615177 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:19 crc kubenswrapper[4698]: E0224 10:19:19.615434 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.729221 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.729309 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.729323 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.729342 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.729353 4698 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-24T10:19:19Z","lastTransitionTime":"2026-02-24T10:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.794701 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-29rvz" podStartSLOduration=141.794670611 podStartE2EDuration="2m21.794670611s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:16.001080013 +0000 UTC m=+181.114694264" watchObservedRunningTime="2026-02-24 10:19:19.794670611 +0000 UTC m=+184.908284892" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.795856 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v"] Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.796501 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.800794 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.800979 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.801312 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.801368 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.941978 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/4df1851e-8573-46d9-b076-58e2d75d177b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.942046 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4df1851e-8573-46d9-b076-58e2d75d177b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.942078 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4df1851e-8573-46d9-b076-58e2d75d177b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.942099 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4df1851e-8573-46d9-b076-58e2d75d177b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:19 crc kubenswrapper[4698]: I0224 10:19:19.942219 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4df1851e-8573-46d9-b076-58e2d75d177b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.043332 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4df1851e-8573-46d9-b076-58e2d75d177b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.043476 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4df1851e-8573-46d9-b076-58e2d75d177b-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.043555 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4df1851e-8573-46d9-b076-58e2d75d177b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.043605 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4df1851e-8573-46d9-b076-58e2d75d177b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.043653 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4df1851e-8573-46d9-b076-58e2d75d177b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.043704 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4df1851e-8573-46d9-b076-58e2d75d177b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.043835 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4df1851e-8573-46d9-b076-58e2d75d177b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.046998 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4df1851e-8573-46d9-b076-58e2d75d177b-service-ca\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.054914 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4df1851e-8573-46d9-b076-58e2d75d177b-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 
10:19:20.074554 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4df1851e-8573-46d9-b076-58e2d75d177b-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-wgm2v\" (UID: \"4df1851e-8573-46d9-b076-58e2d75d177b\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.116070 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.550190 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" event={"ID":"4df1851e-8573-46d9-b076-58e2d75d177b","Type":"ContainerStarted","Data":"b84b7106dc6d338cc09ed6ecb8fb1c4ad88badc3c6e8de5e664ea629dcd2355d"} Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.550575 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" event={"ID":"4df1851e-8573-46d9-b076-58e2d75d177b","Type":"ContainerStarted","Data":"df7dc5f97a9160189122a1d32525308ac8cdd9cff0c641628af698f7b6934e8c"} Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.567530 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-wgm2v" podStartSLOduration=142.567499944 podStartE2EDuration="2m22.567499944s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:20.564525103 +0000 UTC m=+185.678139414" watchObservedRunningTime="2026-02-24 10:19:20.567499944 +0000 UTC m=+185.681114225" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.614439 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.614501 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:20 crc kubenswrapper[4698]: E0224 10:19:20.614554 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:20 crc kubenswrapper[4698]: E0224 10:19:20.614642 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.663267 4698 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 24 10:19:20 crc kubenswrapper[4698]: I0224 10:19:20.673962 4698 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 24 10:19:20 crc kubenswrapper[4698]: E0224 10:19:20.753392 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 24 10:19:21 crc kubenswrapper[4698]: I0224 10:19:21.615505 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:21 crc kubenswrapper[4698]: I0224 10:19:21.615610 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:21 crc kubenswrapper[4698]: E0224 10:19:21.615787 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:21 crc kubenswrapper[4698]: E0224 10:19:21.616102 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:22 crc kubenswrapper[4698]: I0224 10:19:22.614081 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:22 crc kubenswrapper[4698]: I0224 10:19:22.614171 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:22 crc kubenswrapper[4698]: E0224 10:19:22.615309 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:22 crc kubenswrapper[4698]: E0224 10:19:22.615437 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:23 crc kubenswrapper[4698]: I0224 10:19:23.613715 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:23 crc kubenswrapper[4698]: I0224 10:19:23.613715 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:23 crc kubenswrapper[4698]: E0224 10:19:23.613895 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:23 crc kubenswrapper[4698]: E0224 10:19:23.613962 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:24 crc kubenswrapper[4698]: I0224 10:19:24.613728 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:24 crc kubenswrapper[4698]: I0224 10:19:24.613836 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:24 crc kubenswrapper[4698]: E0224 10:19:24.613921 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:24 crc kubenswrapper[4698]: E0224 10:19:24.614062 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:25 crc kubenswrapper[4698]: I0224 10:19:25.569507 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mbk6_17dd9ce8-b1ca-4810-85fe-9775919eb4b5/kube-multus/1.log" Feb 24 10:19:25 crc kubenswrapper[4698]: I0224 10:19:25.570088 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mbk6_17dd9ce8-b1ca-4810-85fe-9775919eb4b5/kube-multus/0.log" Feb 24 10:19:25 crc kubenswrapper[4698]: I0224 10:19:25.570144 4698 generic.go:334] "Generic (PLEG): container finished" podID="17dd9ce8-b1ca-4810-85fe-9775919eb4b5" containerID="26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7" exitCode=1 Feb 24 10:19:25 crc kubenswrapper[4698]: I0224 10:19:25.570178 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7mbk6" event={"ID":"17dd9ce8-b1ca-4810-85fe-9775919eb4b5","Type":"ContainerDied","Data":"26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7"} Feb 24 10:19:25 crc kubenswrapper[4698]: I0224 10:19:25.570214 4698 scope.go:117] "RemoveContainer" containerID="ac059400b5a17e1f1dc36d2fe35b5c8ace2dad5326f3933873eae644e1786c54" Feb 24 10:19:25 crc kubenswrapper[4698]: I0224 10:19:25.570775 4698 scope.go:117] "RemoveContainer" containerID="26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7" Feb 24 10:19:25 crc kubenswrapper[4698]: E0224 10:19:25.571138 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-7mbk6_openshift-multus(17dd9ce8-b1ca-4810-85fe-9775919eb4b5)\"" pod="openshift-multus/multus-7mbk6" podUID="17dd9ce8-b1ca-4810-85fe-9775919eb4b5" Feb 24 10:19:25 crc kubenswrapper[4698]: I0224 10:19:25.615911 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:25 crc kubenswrapper[4698]: E0224 10:19:25.616016 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:25 crc kubenswrapper[4698]: I0224 10:19:25.616319 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:25 crc kubenswrapper[4698]: E0224 10:19:25.616383 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:25 crc kubenswrapper[4698]: E0224 10:19:25.753737 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:19:26 crc kubenswrapper[4698]: I0224 10:19:26.614736 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:26 crc kubenswrapper[4698]: I0224 10:19:26.614796 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:26 crc kubenswrapper[4698]: E0224 10:19:26.615378 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:26 crc kubenswrapper[4698]: E0224 10:19:26.615532 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:27 crc kubenswrapper[4698]: I0224 10:19:27.580624 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mbk6_17dd9ce8-b1ca-4810-85fe-9775919eb4b5/kube-multus/1.log" Feb 24 10:19:27 crc kubenswrapper[4698]: I0224 10:19:27.614441 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:27 crc kubenswrapper[4698]: I0224 10:19:27.614486 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:27 crc kubenswrapper[4698]: E0224 10:19:27.614686 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:27 crc kubenswrapper[4698]: E0224 10:19:27.614825 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:28 crc kubenswrapper[4698]: I0224 10:19:28.614421 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:28 crc kubenswrapper[4698]: I0224 10:19:28.614452 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:28 crc kubenswrapper[4698]: E0224 10:19:28.614562 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:28 crc kubenswrapper[4698]: E0224 10:19:28.614682 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:29 crc kubenswrapper[4698]: I0224 10:19:29.614383 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:29 crc kubenswrapper[4698]: I0224 10:19:29.614428 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:29 crc kubenswrapper[4698]: E0224 10:19:29.614629 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:29 crc kubenswrapper[4698]: E0224 10:19:29.614735 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:29 crc kubenswrapper[4698]: I0224 10:19:29.615548 4698 scope.go:117] "RemoveContainer" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71" Feb 24 10:19:29 crc kubenswrapper[4698]: E0224 10:19:29.615728 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-mgh7p_openshift-ovn-kubernetes(066df704-6981-4770-a647-df52a0da50a0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" Feb 24 10:19:30 crc kubenswrapper[4698]: I0224 10:19:30.614530 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:30 crc kubenswrapper[4698]: I0224 10:19:30.614564 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:30 crc kubenswrapper[4698]: E0224 10:19:30.614793 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:30 crc kubenswrapper[4698]: E0224 10:19:30.615017 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:30 crc kubenswrapper[4698]: E0224 10:19:30.755189 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 24 10:19:31 crc kubenswrapper[4698]: I0224 10:19:31.614440 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:31 crc kubenswrapper[4698]: I0224 10:19:31.614490 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:31 crc kubenswrapper[4698]: E0224 10:19:31.614663 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 24 10:19:31 crc kubenswrapper[4698]: E0224 10:19:31.614817 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 24 10:19:32 crc kubenswrapper[4698]: I0224 10:19:32.614215 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:19:32 crc kubenswrapper[4698]: I0224 10:19:32.614236 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 24 10:19:32 crc kubenswrapper[4698]: E0224 10:19:32.614456 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9" Feb 24 10:19:32 crc kubenswrapper[4698]: E0224 10:19:32.614614 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 24 10:19:33 crc kubenswrapper[4698]: I0224 10:19:33.614523 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:19:33 crc kubenswrapper[4698]: I0224 10:19:33.614584 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 24 10:19:33 crc kubenswrapper[4698]: E0224 10:19:33.614650 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:19:33 crc kubenswrapper[4698]: E0224 10:19:33.614776 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:19:34 crc kubenswrapper[4698]: I0224 10:19:34.614663 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm"
Feb 24 10:19:34 crc kubenswrapper[4698]: I0224 10:19:34.614697 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:34 crc kubenswrapper[4698]: E0224 10:19:34.614787 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9"
Feb 24 10:19:34 crc kubenswrapper[4698]: E0224 10:19:34.614994 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:19:35 crc kubenswrapper[4698]: I0224 10:19:35.614577 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:35 crc kubenswrapper[4698]: I0224 10:19:35.614577 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:35 crc kubenswrapper[4698]: E0224 10:19:35.618294 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:19:35 crc kubenswrapper[4698]: E0224 10:19:35.618739 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:19:35 crc kubenswrapper[4698]: E0224 10:19:35.756811 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 24 10:19:36 crc kubenswrapper[4698]: I0224 10:19:36.613998 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:36 crc kubenswrapper[4698]: E0224 10:19:36.614305 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:19:36 crc kubenswrapper[4698]: I0224 10:19:36.615155 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm"
Feb 24 10:19:36 crc kubenswrapper[4698]: E0224 10:19:36.616192 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9"
Feb 24 10:19:37 crc kubenswrapper[4698]: I0224 10:19:37.614397 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:37 crc kubenswrapper[4698]: I0224 10:19:37.614483 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:37 crc kubenswrapper[4698]: E0224 10:19:37.614596 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:19:37 crc kubenswrapper[4698]: E0224 10:19:37.615006 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:19:38 crc kubenswrapper[4698]: I0224 10:19:38.614189 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm"
Feb 24 10:19:38 crc kubenswrapper[4698]: I0224 10:19:38.614708 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:38 crc kubenswrapper[4698]: E0224 10:19:38.614909 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9"
Feb 24 10:19:38 crc kubenswrapper[4698]: E0224 10:19:38.615038 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:19:39 crc kubenswrapper[4698]: I0224 10:19:39.613905 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:39 crc kubenswrapper[4698]: I0224 10:19:39.613912 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:39 crc kubenswrapper[4698]: E0224 10:19:39.614159 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:19:39 crc kubenswrapper[4698]: E0224 10:19:39.614299 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:19:40 crc kubenswrapper[4698]: I0224 10:19:40.614299 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm"
Feb 24 10:19:40 crc kubenswrapper[4698]: I0224 10:19:40.614393 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:40 crc kubenswrapper[4698]: E0224 10:19:40.614573 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9"
Feb 24 10:19:40 crc kubenswrapper[4698]: E0224 10:19:40.615044 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:19:40 crc kubenswrapper[4698]: I0224 10:19:40.615335 4698 scope.go:117] "RemoveContainer" containerID="26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7"
Feb 24 10:19:40 crc kubenswrapper[4698]: E0224 10:19:40.758029 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 24 10:19:41 crc kubenswrapper[4698]: I0224 10:19:41.613899 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:41 crc kubenswrapper[4698]: I0224 10:19:41.613960 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:41 crc kubenswrapper[4698]: E0224 10:19:41.614643 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:19:41 crc kubenswrapper[4698]: E0224 10:19:41.614737 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:19:41 crc kubenswrapper[4698]: I0224 10:19:41.633600 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mbk6_17dd9ce8-b1ca-4810-85fe-9775919eb4b5/kube-multus/1.log"
Feb 24 10:19:41 crc kubenswrapper[4698]: I0224 10:19:41.633746 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7mbk6" event={"ID":"17dd9ce8-b1ca-4810-85fe-9775919eb4b5","Type":"ContainerStarted","Data":"ab364baedbeb66518d2c61a0989a799a3a60377047595973f394b87edd9b060a"}
Feb 24 10:19:42 crc kubenswrapper[4698]: I0224 10:19:42.614730 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:42 crc kubenswrapper[4698]: I0224 10:19:42.614772 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm"
Feb 24 10:19:42 crc kubenswrapper[4698]: E0224 10:19:42.615237 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:19:42 crc kubenswrapper[4698]: E0224 10:19:42.615467 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9"
Feb 24 10:19:43 crc kubenswrapper[4698]: I0224 10:19:43.614069 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:43 crc kubenswrapper[4698]: E0224 10:19:43.614205 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:19:43 crc kubenswrapper[4698]: I0224 10:19:43.614372 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:43 crc kubenswrapper[4698]: E0224 10:19:43.614562 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:19:43 crc kubenswrapper[4698]: I0224 10:19:43.615835 4698 scope.go:117] "RemoveContainer" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"
Feb 24 10:19:44 crc kubenswrapper[4698]: I0224 10:19:44.420565 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rpnnm"]
Feb 24 10:19:44 crc kubenswrapper[4698]: I0224 10:19:44.420652 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm"
Feb 24 10:19:44 crc kubenswrapper[4698]: E0224 10:19:44.420734 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9"
Feb 24 10:19:44 crc kubenswrapper[4698]: I0224 10:19:44.614301 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:44 crc kubenswrapper[4698]: E0224 10:19:44.614513 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:19:44 crc kubenswrapper[4698]: I0224 10:19:44.644214 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/3.log"
Feb 24 10:19:44 crc kubenswrapper[4698]: I0224 10:19:44.646971 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerStarted","Data":"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"}
Feb 24 10:19:44 crc kubenswrapper[4698]: I0224 10:19:44.647518 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p"
Feb 24 10:19:44 crc kubenswrapper[4698]: I0224 10:19:44.678512 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podStartSLOduration=166.678490175 podStartE2EDuration="2m46.678490175s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:19:44.676100779 +0000 UTC m=+209.789715040" watchObservedRunningTime="2026-02-24 10:19:44.678490175 +0000 UTC m=+209.792104426"
Feb 24 10:19:45 crc kubenswrapper[4698]: I0224 10:19:45.614740 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:45 crc kubenswrapper[4698]: I0224 10:19:45.614777 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:45 crc kubenswrapper[4698]: E0224 10:19:45.615950 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:19:45 crc kubenswrapper[4698]: E0224 10:19:45.616036 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:19:45 crc kubenswrapper[4698]: E0224 10:19:45.758988 4698 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 24 10:19:46 crc kubenswrapper[4698]: I0224 10:19:46.614689 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:46 crc kubenswrapper[4698]: I0224 10:19:46.614789 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm"
Feb 24 10:19:46 crc kubenswrapper[4698]: E0224 10:19:46.614954 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:19:46 crc kubenswrapper[4698]: E0224 10:19:46.615094 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9"
Feb 24 10:19:47 crc kubenswrapper[4698]: I0224 10:19:47.614713 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:47 crc kubenswrapper[4698]: I0224 10:19:47.614749 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:47 crc kubenswrapper[4698]: E0224 10:19:47.614916 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:19:47 crc kubenswrapper[4698]: E0224 10:19:47.615043 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:19:48 crc kubenswrapper[4698]: I0224 10:19:48.613868 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm"
Feb 24 10:19:48 crc kubenswrapper[4698]: I0224 10:19:48.613911 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:48 crc kubenswrapper[4698]: E0224 10:19:48.614055 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9"
Feb 24 10:19:48 crc kubenswrapper[4698]: E0224 10:19:48.614150 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:19:49 crc kubenswrapper[4698]: I0224 10:19:49.613771 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:49 crc kubenswrapper[4698]: I0224 10:19:49.613859 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:49 crc kubenswrapper[4698]: E0224 10:19:49.613884 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 24 10:19:49 crc kubenswrapper[4698]: E0224 10:19:49.614045 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 24 10:19:50 crc kubenswrapper[4698]: I0224 10:19:50.614792 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm"
Feb 24 10:19:50 crc kubenswrapper[4698]: I0224 10:19:50.614851 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:50 crc kubenswrapper[4698]: E0224 10:19:50.615009 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rpnnm" podUID="17a1338b-6385-4795-9397-74316d6599d9"
Feb 24 10:19:50 crc kubenswrapper[4698]: E0224 10:19:50.615139 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 24 10:19:51 crc kubenswrapper[4698]: I0224 10:19:51.614165 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:51 crc kubenswrapper[4698]: I0224 10:19:51.614983 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:51 crc kubenswrapper[4698]: I0224 10:19:51.617393 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 24 10:19:51 crc kubenswrapper[4698]: I0224 10:19:51.619045 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 24 10:19:51 crc kubenswrapper[4698]: I0224 10:19:51.619561 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 24 10:19:51 crc kubenswrapper[4698]: I0224 10:19:51.619563 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 24 10:19:52 crc kubenswrapper[4698]: I0224 10:19:52.196724 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:19:52 crc kubenswrapper[4698]: I0224 10:19:52.196820 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:19:52 crc kubenswrapper[4698]: I0224 10:19:52.599504 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p"
Feb 24 10:19:52 crc kubenswrapper[4698]: I0224 10:19:52.614509 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:52 crc kubenswrapper[4698]: I0224 10:19:52.614517 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm"
Feb 24 10:19:52 crc kubenswrapper[4698]: I0224 10:19:52.617285 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 24 10:19:52 crc kubenswrapper[4698]: I0224 10:19:52.617565 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.614844 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:19:59 crc kubenswrapper[4698]: E0224 10:19:59.615106 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:22:01.615064386 +0000 UTC m=+346.728678667 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.615245 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.615310 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.627319 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.627852 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.715751 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.715820 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.720357 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.720686 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.728371 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.735109 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:19:59 crc kubenswrapper[4698]: I0224 10:19:59.831946 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.056629 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-93f45c41f68c78a931dc9d646dd09892b8d674803e982703d7846acbf8bde046 WatchSource:0}: Error finding container 93f45c41f68c78a931dc9d646dd09892b8d674803e982703d7846acbf8bde046: Status 404 returned error can't find the container with id 93f45c41f68c78a931dc9d646dd09892b8d674803e982703d7846acbf8bde046
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.247346 4698 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.296500 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vb7wk"]
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.301505 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-w8bmq"]
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.301837 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fq25r"]
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.302325 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"]
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.302489 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.302854 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.303282 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.303654 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"
Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.313016 4698 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.313077 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.313162 4698 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.313183 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.313926 4698 reflector.go:561] object-"openshift-controller-manager"/"openshift-global-ca": failed to list *v1.ConfigMap: configmaps "openshift-global-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.313954 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-global-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-global-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.314057 4698 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace
"openshift-controller-manager": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.314078 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.314144 4698 reflector.go:561] object-"openshift-route-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.314164 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.316391 4698 reflector.go:561] object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2": failed to list *v1.Secret: secrets "route-controller-manager-sa-dockercfg-h2zr2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.316426 4698 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-route-controller-manager\"/\"route-controller-manager-sa-dockercfg-h2zr2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"route-controller-manager-sa-dockercfg-h2zr2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.316482 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.317169 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.320455 4698 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7": failed to list *v1.Secret: secrets "machine-api-operator-dockercfg-mfbb7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.320506 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-dockercfg-mfbb7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-dockercfg-mfbb7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.320597 4698 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-tls": failed to list *v1.Secret: secrets "machine-api-operator-tls" is forbidden: User "system:node:crc" cannot list resource 
"secrets" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.320620 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"machine-api-operator-tls\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.320707 4698 reflector.go:561] object-"openshift-machine-api"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.320714 4698 reflector.go:561] object-"openshift-authentication-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.320728 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.320757 4698 
reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.320869 4698 reflector.go:561] object-"openshift-authentication-operator"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.320878 4698 reflector.go:561] object-"openshift-route-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.320918 4698 reflector.go:561] object-"openshift-machine-api"/"machine-api-operator-images": failed to list *v1.ConfigMap: configmaps "machine-api-operator-images" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.320930 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the 
namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.320943 4698 reflector.go:561] object-"openshift-machine-api"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.320945 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"machine-api-operator-images\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"machine-api-operator-images\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.320968 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.320950 4698 reflector.go:561] object-"openshift-authentication-operator"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.321042 4698 reflector.go:561] 
object-"openshift-route-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.321072 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.321047 4698 reflector.go:561] object-"openshift-route-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.321106 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.320888 4698 reflector.go:561] object-"openshift-authentication-operator"/"service-ca-bundle": failed to list *v1.ConfigMap: configmaps "service-ca-bundle" is forbidden: User 
"system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.321130 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"service-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"service-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.320887 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.321147 4698 reflector.go:561] object-"openshift-authentication-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.321062 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no 
relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.321195 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.320920 4698 reflector.go:561] object-"openshift-authentication-operator"/"authentication-operator-config": failed to list *v1.ConfigMap: configmaps "authentication-operator-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.321347 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"authentication-operator-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"authentication-operator-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321423 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk9sk\" (UniqueName: \"kubernetes.io/projected/59c9844c-00fe-42cd-add6-9ab528da273d-kube-api-access-zk9sk\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321464 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21ede3a0-b26f-4e29-8a13-86d877b60519-serving-cert\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321485 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-config\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321505 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h46gj\" (UniqueName: \"kubernetes.io/projected/38bdf14d-35ac-440b-9a16-9a4ddd53df34-kube-api-access-h46gj\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321527 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21ede3a0-b26f-4e29-8a13-86d877b60519-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321548 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/21ede3a0-b26f-4e29-8a13-86d877b60519-audit-dir\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321567 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321588 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-service-ca-bundle\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321611 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-client-ca\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321630 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb89h\" (UniqueName: \"kubernetes.io/projected/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-kube-api-access-kb89h\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321650 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/21ede3a0-b26f-4e29-8a13-86d877b60519-encryption-config\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321669 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-serving-cert\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321687 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c9844c-00fe-42cd-add6-9ab528da273d-serving-cert\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321709 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-config\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321729 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/21ede3a0-b26f-4e29-8a13-86d877b60519-audit-policies\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321748 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fccccb67-888f-4a34-a701-61926e9819a6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321768 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321788 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/21ede3a0-b26f-4e29-8a13-86d877b60519-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321806 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-images\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321862 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321883 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-client-ca\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321901 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptvs7\" (UniqueName: \"kubernetes.io/projected/fccccb67-888f-4a34-a701-61926e9819a6-kube-api-access-ptvs7\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321922 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-config\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321940 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-config\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321960 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21ede3a0-b26f-4e29-8a13-86d877b60519-etcd-client\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.321980 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s22m7\" (UniqueName: \"kubernetes.io/projected/21ede3a0-b26f-4e29-8a13-86d877b60519-kube-api-access-s22m7\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.325137 4698 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.325221 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" 
logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.325372 4698 reflector.go:561] object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj": failed to list *v1.Secret: secrets "authentication-operator-dockercfg-mz9bj" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-authentication-operator": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.325413 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-authentication-operator\"/\"authentication-operator-dockercfg-mz9bj\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"authentication-operator-dockercfg-mz9bj\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-authentication-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.325554 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.325962 4698 reflector.go:561] object-"openshift-machine-api"/"kube-rbac-proxy": failed to list *v1.ConfigMap: configmaps "kube-rbac-proxy" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-machine-api": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.326006 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-machine-api\"/\"kube-rbac-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-rbac-proxy\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-machine-api\": no relationship found between node 'crc' and this 
object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.326096 4698 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.326149 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.326395 4698 reflector.go:561] object-"openshift-route-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-route-controller-manager": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.326535 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-route-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-route-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: W0224 10:20:00.326632 4698 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User 
"system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object Feb 24 10:20:00 crc kubenswrapper[4698]: E0224 10:20:00.326665 4698 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.326712 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.327508 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-v6qbj"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.327719 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.328375 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lqzfp"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.329446 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.329683 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.331329 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.331377 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.331489 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.331578 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.331658 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.331770 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.331873 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.331963 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.332069 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.332103 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 24 10:20:00 crc 
kubenswrapper[4698]: I0224 10:20:00.335316 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-clwmh"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.335787 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.335847 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.335974 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qzmkf"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.336526 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qzmkf" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.342300 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.343162 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5ndrw"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.348756 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.344601 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.344750 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.348925 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.349069 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.348165 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.348216 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.348304 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.348363 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.348422 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.348547 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.348883 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.349785 4698 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.350027 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.350084 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.350112 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.350576 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hxxxs"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.350836 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.351089 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.351297 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bmr2l"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.351759 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.354512 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.354594 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.354739 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.354750 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.355144 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.354518 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.377010 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.377246 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.377560 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.404707 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.405573 4698 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.405680 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.405732 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.405816 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.406211 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.406382 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.406752 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.407222 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.409044 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.409214 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.409415 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.409448 4698 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.409462 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.409800 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.409813 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.410013 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.410126 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.410665 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.410909 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.411228 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.411347 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.411438 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 24 10:20:00 crc 
kubenswrapper[4698]: I0224 10:20:00.412490 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.412511 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.412666 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.412822 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.412960 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.413120 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.413292 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.413357 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.413287 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.475222 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.475642 4698 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476114 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21ede3a0-b26f-4e29-8a13-86d877b60519-etcd-client\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476158 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476293 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s22m7\" (UniqueName: \"kubernetes.io/projected/21ede3a0-b26f-4e29-8a13-86d877b60519-kube-api-access-s22m7\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476362 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk9sk\" (UniqueName: \"kubernetes.io/projected/59c9844c-00fe-42cd-add6-9ab528da273d-kube-api-access-zk9sk\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476432 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a09fb76f-4291-4945-8bb0-15c478a35cbf-config\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 
crc kubenswrapper[4698]: I0224 10:20:00.476466 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a09fb76f-4291-4945-8bb0-15c478a35cbf-machine-approver-tls\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476577 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476752 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476670 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21ede3a0-b26f-4e29-8a13-86d877b60519-serving-cert\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476818 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-config\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476827 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476847 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h46gj\" (UniqueName: 
\"kubernetes.io/projected/38bdf14d-35ac-440b-9a16-9a4ddd53df34-kube-api-access-h46gj\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.476957 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.477561 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.477705 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.477877 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.477906 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-z42jf"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.478275 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.478579 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-z42jf" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.478841 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.479233 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.484048 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.484576 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.484737 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.485308 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.485601 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-trk7h"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.485948 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.486043 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21ede3a0-b26f-4e29-8a13-86d877b60519-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.486983 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.487447 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.487447 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-trk7h" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.487951 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.488184 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.488298 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21ede3a0-b26f-4e29-8a13-86d877b60519-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.488472 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21ede3a0-b26f-4e29-8a13-86d877b60519-etcd-client\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.488779 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21ede3a0-b26f-4e29-8a13-86d877b60519-audit-dir\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.489071 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.489153 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-service-ca-bundle\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.489235 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djqvw\" (UniqueName: \"kubernetes.io/projected/a09fb76f-4291-4945-8bb0-15c478a35cbf-kube-api-access-djqvw\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.489334 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-client-ca\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.489429 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb89h\" (UniqueName: \"kubernetes.io/projected/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-kube-api-access-kb89h\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.489691 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/21ede3a0-b26f-4e29-8a13-86d877b60519-encryption-config\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.489764 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-serving-cert\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.489900 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c9844c-00fe-42cd-add6-9ab528da273d-serving-cert\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.489973 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-config\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.490057 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/21ede3a0-b26f-4e29-8a13-86d877b60519-audit-policies\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.488995 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21ede3a0-b26f-4e29-8a13-86d877b60519-audit-dir\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.490369 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fccccb67-888f-4a34-a701-61926e9819a6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.490947 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491081 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/21ede3a0-b26f-4e29-8a13-86d877b60519-audit-policies\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491167 4698 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/21ede3a0-b26f-4e29-8a13-86d877b60519-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491226 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21ede3a0-b26f-4e29-8a13-86d877b60519-serving-cert\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.490392 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491395 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-images\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491473 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491540 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-client-ca\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: 
\"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491619 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptvs7\" (UniqueName: \"kubernetes.io/projected/fccccb67-888f-4a34-a701-61926e9819a6-kube-api-access-ptvs7\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491701 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-config\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491775 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-config\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491842 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/21ede3a0-b26f-4e29-8a13-86d877b60519-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491735 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb"] Feb 24 10:20:00 
crc kubenswrapper[4698]: I0224 10:20:00.491854 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a09fb76f-4291-4945-8bb0-15c478a35cbf-auth-proxy-config\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.491810 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.492304 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.492531 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.493924 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.494200 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/21ede3a0-b26f-4e29-8a13-86d877b60519-encryption-config\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.494244 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.503695 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f69tr"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.503789 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.504336 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.504434 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-f69tr" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.508570 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.509640 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-csv7z"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.510237 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-79f62"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.510906 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.512024 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fq25r"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.512107 4698 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-fd5xc"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.513467 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.513924 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.514169 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.514296 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.518733 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.520037 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.520115 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.524402 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.526086 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.528154 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.530814 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.531735 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.534608 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.535101 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.535821 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vb7wk"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.536820 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.537362 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.537860 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.538531 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.539076 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.539605 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.540356 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.541056 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.541430 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-clwmh"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.542536 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-w8bmq"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.546345 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.548044 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-v6qbj"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.549745 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.549779 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lqzfp"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.552328 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.552996 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.554250 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6h5bj"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.554901 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-z42jf"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.555008 4698 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6h5bj" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.555310 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.556973 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f69tr"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.558737 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.559778 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qzmkf"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.561111 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-csv7z"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.562982 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fd5xc"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.564188 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.564731 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.566137 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5ndrw"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.567363 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds"] Feb 24 
10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.568515 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bmr2l"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.569680 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.570676 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.571838 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.572823 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-79f62"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.573833 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.575232 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hxxxs"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.576358 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.577559 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.578649 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 
10:20:00.579656 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6kd89"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.580438 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6kd89" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.581506 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fbt6k"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.583949 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.585229 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.585343 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.587452 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.588711 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6h5bj"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.589678 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.591935 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.593089 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1539b772-1d04-4bc9-85f1-99a99b5d237d-serving-cert\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.593152 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1539b772-1d04-4bc9-85f1-99a99b5d237d-config\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.593457 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a09fb76f-4291-4945-8bb0-15c478a35cbf-auth-proxy-config\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.593518 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff6t5\" (UniqueName: \"kubernetes.io/projected/1539b772-1d04-4bc9-85f1-99a99b5d237d-kube-api-access-ff6t5\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.593563 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1539b772-1d04-4bc9-85f1-99a99b5d237d-etcd-client\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.593594 4698 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a09fb76f-4291-4945-8bb0-15c478a35cbf-config\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.593628 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a09fb76f-4291-4945-8bb0-15c478a35cbf-machine-approver-tls\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.593717 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djqvw\" (UniqueName: \"kubernetes.io/projected/a09fb76f-4291-4945-8bb0-15c478a35cbf-kube-api-access-djqvw\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.593982 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1539b772-1d04-4bc9-85f1-99a99b5d237d-etcd-ca\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.594018 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1539b772-1d04-4bc9-85f1-99a99b5d237d-etcd-service-ca\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.594947 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a09fb76f-4291-4945-8bb0-15c478a35cbf-auth-proxy-config\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.595443 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a09fb76f-4291-4945-8bb0-15c478a35cbf-config\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.595509 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.595548 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6kd89"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.597769 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fbt6k"] Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.598743 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a09fb76f-4291-4945-8bb0-15c478a35cbf-machine-approver-tls\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.601717 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv"]
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.602860 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-z2fjr"]
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.603541 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-z2fjr"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.604498 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.624873 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.694614 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1539b772-1d04-4bc9-85f1-99a99b5d237d-etcd-ca\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.694652 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1539b772-1d04-4bc9-85f1-99a99b5d237d-etcd-service-ca\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.694688 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1539b772-1d04-4bc9-85f1-99a99b5d237d-serving-cert\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.694723 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1539b772-1d04-4bc9-85f1-99a99b5d237d-config\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.694788 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1539b772-1d04-4bc9-85f1-99a99b5d237d-etcd-client\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.694812 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff6t5\" (UniqueName: \"kubernetes.io/projected/1539b772-1d04-4bc9-85f1-99a99b5d237d-kube-api-access-ff6t5\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.695506 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/1539b772-1d04-4bc9-85f1-99a99b5d237d-etcd-service-ca\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.695505 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/1539b772-1d04-4bc9-85f1-99a99b5d237d-etcd-ca\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.695614 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1539b772-1d04-4bc9-85f1-99a99b5d237d-config\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.697487 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1539b772-1d04-4bc9-85f1-99a99b5d237d-serving-cert\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.698274 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1539b772-1d04-4bc9-85f1-99a99b5d237d-etcd-client\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.698475 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s22m7\" (UniqueName: \"kubernetes.io/projected/21ede3a0-b26f-4e29-8a13-86d877b60519-kube-api-access-s22m7\") pod \"apiserver-7bbb656c7d-dkf4t\" (UID: \"21ede3a0-b26f-4e29-8a13-86d877b60519\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.718562 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7b666519b9d3b28b1d27e30db1fb2f4fbccb8b24311b5ec02ed5a7a5b1d13116"}
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.718623 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"7b08de0bbd521705969f8228154a335e0676d17fb2013dbb91a1218433dd16c7"}
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.719669 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7f5aaa2b6a403a0b01f1aa5db56a86861e95ce93ca6a60a68e59ea672662e9e3"}
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.719713 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"414685c741fb688e639952a74909befabbc702ba04237860b7a65345dd1d8c37"}
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.721108 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9f5741db638d24339660fc7733d462fff9a8a4027371576df0fdd3e16e4b84f1"}
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.721131 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"93f45c41f68c78a931dc9d646dd09892b8d674803e982703d7846acbf8bde046"}
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.744514 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.764911 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.784860 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.792308 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.805753 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.825105 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.846151 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.864994 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.885025 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.905107 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.925117 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.945019 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.947708 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t"]
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.964717 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 24 10:20:00 crc kubenswrapper[4698]: I0224 10:20:00.985418 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.004411 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.025050 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.044623 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.064774 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.124598 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.143917 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.167028 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.183889 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.204298 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.224117 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.244912 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.264452 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.284311 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.304914 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.324372 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.345287 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.365290 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.393948 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.404390 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.425056 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.444580 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.465179 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.478465 4698 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.478560 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-config podName:fccccb67-888f-4a34-a701-61926e9819a6 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.978534951 +0000 UTC m=+227.092149212 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-config") pod "machine-api-operator-5694c8668f-vb7wk" (UID: "fccccb67-888f-4a34-a701-61926e9819a6") : failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.484843 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.490152 4698 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.490175 4698 secret.go:188] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.490207 4698 configmap.go:193] Couldn't get configMap openshift-authentication-operator/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.490211 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles podName:38bdf14d-35ac-440b-9a16-9a4ddd53df34 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.990193911 +0000 UTC m=+227.103808162 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles") pod "controller-manager-879f6c89f-w8bmq" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34") : failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.490155 4698 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.490230 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-serving-cert podName:7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.990220282 +0000 UTC m=+227.103834543 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-serving-cert") pod "route-controller-manager-6576b87f9c-bfmpl" (UID: "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f") : failed to sync secret cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.490244 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-service-ca-bundle podName:59c9844c-00fe-42cd-add6-9ab528da273d nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.990237842 +0000 UTC m=+227.103852093 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-service-ca-bundle") pod "authentication-operator-69f744f599-fq25r" (UID: "59c9844c-00fe-42cd-add6-9ab528da273d") : failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.490290 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-client-ca podName:7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.990252873 +0000 UTC m=+227.103867124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-client-ca") pod "route-controller-manager-6576b87f9c-bfmpl" (UID: "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f") : failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491391 4698 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491426 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-config podName:7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.99141598 +0000 UTC m=+227.105030221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-config") pod "route-controller-manager-6576b87f9c-bfmpl" (UID: "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f") : failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491392 4698 configmap.go:193] Couldn't get configMap openshift-authentication-operator/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491443 4698 secret.go:188] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491455 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-trusted-ca-bundle podName:59c9844c-00fe-42cd-add6-9ab528da273d nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.991448291 +0000 UTC m=+227.105062532 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-trusted-ca-bundle") pod "authentication-operator-69f744f599-fq25r" (UID: "59c9844c-00fe-42cd-add6-9ab528da273d") : failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491521 4698 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491556 4698 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491530 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fccccb67-888f-4a34-a701-61926e9819a6-machine-api-operator-tls podName:fccccb67-888f-4a34-a701-61926e9819a6 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.991479521 +0000 UTC m=+227.105093772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/fccccb67-888f-4a34-a701-61926e9819a6-machine-api-operator-tls") pod "machine-api-operator-5694c8668f-vb7wk" (UID: "fccccb67-888f-4a34-a701-61926e9819a6") : failed to sync secret cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491655 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-images podName:fccccb67-888f-4a34-a701-61926e9819a6 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.991641475 +0000 UTC m=+227.105255726 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-images") pod "machine-api-operator-5694c8668f-vb7wk" (UID: "fccccb67-888f-4a34-a701-61926e9819a6") : failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491723 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert podName:38bdf14d-35ac-440b-9a16-9a4ddd53df34 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.991707197 +0000 UTC m=+227.105321448 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert") pod "controller-manager-879f6c89f-w8bmq" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34") : failed to sync secret cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491757 4698 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491795 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-client-ca podName:38bdf14d-35ac-440b-9a16-9a4ddd53df34 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.991781688 +0000 UTC m=+227.105395929 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-client-ca") pod "controller-manager-879f6c89f-w8bmq" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34") : failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491827 4698 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491881 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-config podName:38bdf14d-35ac-440b-9a16-9a4ddd53df34 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.99187285 +0000 UTC m=+227.105487091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-config") pod "controller-manager-879f6c89f-w8bmq" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34") : failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491914 4698 configmap.go:193] Couldn't get configMap openshift-authentication-operator/authentication-operator-config: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.491937 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-config podName:59c9844c-00fe-42cd-add6-9ab528da273d nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.991930722 +0000 UTC m=+227.105544963 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-config") pod "authentication-operator-69f744f599-fq25r" (UID: "59c9844c-00fe-42cd-add6-9ab528da273d") : failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.492667 4698 secret.go:188] Couldn't get secret openshift-authentication-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.492711 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59c9844c-00fe-42cd-add6-9ab528da273d-serving-cert podName:59c9844c-00fe-42cd-add6-9ab528da273d nodeName:}" failed. No retries permitted until 2026-02-24 10:20:01.99269836 +0000 UTC m=+227.106312601 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/59c9844c-00fe-42cd-add6-9ab528da273d-serving-cert") pod "authentication-operator-69f744f599-fq25r" (UID: "59c9844c-00fe-42cd-add6-9ab528da273d") : failed to sync secret cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.503682 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.523628 4698 request.go:700] Waited for 1.007713923s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.524541 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.545144 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.564736 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.583793 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.614874 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.627498 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.644629 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.665373 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.685173 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.704575 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.716808 4698 projected.go:288] Couldn't get configMap openshift-authentication-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.724837 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.725779 4698 generic.go:334] "Generic (PLEG): container finished" podID="21ede3a0-b26f-4e29-8a13-86d877b60519" containerID="13ccf9770cc92587356f83b563ab35cbe272fb0d5a918e9527fb2bfffd7d26b8" exitCode=0
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.725904 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" event={"ID":"21ede3a0-b26f-4e29-8a13-86d877b60519","Type":"ContainerDied","Data":"13ccf9770cc92587356f83b563ab35cbe272fb0d5a918e9527fb2bfffd7d26b8"}
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.725938 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" event={"ID":"21ede3a0-b26f-4e29-8a13-86d877b60519","Type":"ContainerStarted","Data":"aeed9a8a0ec3bf1bc09aeda092fefaa41e7537e39d09317d2e99be46abab6ffb"}
Feb 24 10:20:01 crc kubenswrapper[4698]: E0224 10:20:01.735659 4698 projected.go:288] Couldn't get configMap openshift-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.745289 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.786014 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.792986 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.804425 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.823949 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.844606 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.864733 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.884765 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.905599 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.924371 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.945080 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.965055 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 24 10:20:01 crc kubenswrapper[4698]: I0224 10:20:01.985327 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.004511 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009398 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c9844c-00fe-42cd-add6-9ab528da273d-serving-cert\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009467 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-config\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009507 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fccccb67-888f-4a34-a701-61926e9819a6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009565 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009595 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-images\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009652 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009678 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-client-ca\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009748 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-config\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009802 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-config\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009899 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-config\") 
pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009957 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.009988 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-service-ca-bundle\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.010060 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-client-ca\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.010088 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-serving-cert\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.024762 4698 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.044670 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.063998 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.084335 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.095628 4698 projected.go:288] Couldn't get configMap openshift-route-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.104470 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.119383 4698 projected.go:288] Couldn't get configMap openshift-machine-api/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.124410 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.144672 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.164036 4698 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.185464 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.205232 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.225152 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.244902 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.265136 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.284931 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.304989 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.324281 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.345025 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.365388 4698 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.384924 4698 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.419812 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djqvw\" (UniqueName: \"kubernetes.io/projected/a09fb76f-4291-4945-8bb0-15c478a35cbf-kube-api-access-djqvw\") pod \"machine-approver-56656f9798-6bh9j\" (UID: \"a09fb76f-4291-4945-8bb0-15c478a35cbf\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.426507 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.444670 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.464876 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515206 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b1ad070-5898-4b3a-ab57-57d781c9b809-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515238 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbzx5\" (UniqueName: \"kubernetes.io/projected/effafc66-9dae-4ef3-86a5-72e1fac84fc4-kube-api-access-gbzx5\") pod \"openshift-apiserver-operator-796bbdcf4f-jrgwq\" (UID: \"effafc66-9dae-4ef3-86a5-72e1fac84fc4\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515273 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/568f96c6-6a68-4e06-a1e1-1b787f58bac7-audit-dir\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515297 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-console-config\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515311 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-trusted-ca-bundle\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515337 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515352 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515367 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-audit\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515385 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk7pb\" (UniqueName: \"kubernetes.io/projected/568f96c6-6a68-4e06-a1e1-1b787f58bac7-kube-api-access-hk7pb\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515401 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wzkz\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-kube-api-access-5wzkz\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515417 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr8mq\" (UniqueName: \"kubernetes.io/projected/6b1ad070-5898-4b3a-ab57-57d781c9b809-kube-api-access-kr8mq\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" Feb 24 
10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515433 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-bound-sa-token\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515447 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/348d0b48-f2a9-4326-b8c8-88f43029f382-console-serving-cert\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515461 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515485 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515500 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515530 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-oauth-serving-cert\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515544 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-image-import-ca\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515556 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/312007fb-fd23-4a36-b653-ea3e24a02ee0-metrics-tls\") pod \"dns-operator-744455d44c-v6qbj\" (UID: \"312007fb-fd23-4a36-b653-ea3e24a02ee0\") " pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515579 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c9967ed-20af-48cf-859d-4c3060d413fb-config\") pod \"console-operator-58897d9998-qzmkf\" (UID: \"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515595 4698 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-certificates\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515610 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c9967ed-20af-48cf-859d-4c3060d413fb-trusted-ca\") pod \"console-operator-58897d9998-qzmkf\" (UID: \"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515642 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b1ad070-5898-4b3a-ab57-57d781c9b809-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515656 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effafc66-9dae-4ef3-86a5-72e1fac84fc4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrgwq\" (UID: \"effafc66-9dae-4ef3-86a5-72e1fac84fc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515675 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74k5q\" (UniqueName: \"kubernetes.io/projected/50f7a0ea-7b15-487b-b907-6fb4c7451eed-kube-api-access-74k5q\") pod 
\"openshift-config-operator-7777fb866f-lcjcd\" (UID: \"50f7a0ea-7b15-487b-b907-6fb4c7451eed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515688 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkh9l\" (UniqueName: \"kubernetes.io/projected/803e0d1c-f298-49b4-9251-9271f311ee92-kube-api-access-wkh9l\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515701 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/568f96c6-6a68-4e06-a1e1-1b787f58bac7-encryption-config\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515721 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515738 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-etcd-serving-ca\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515775 4698 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6dc6e77-8617-4bc0-8960-6b81b87c8b88-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-snxxh\" (UID: \"e6dc6e77-8617-4bc0-8960-6b81b87c8b88\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515790 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kgzn\" (UniqueName: \"kubernetes.io/projected/312007fb-fd23-4a36-b653-ea3e24a02ee0-kube-api-access-9kgzn\") pod \"dns-operator-744455d44c-v6qbj\" (UID: \"312007fb-fd23-4a36-b653-ea3e24a02ee0\") " pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515827 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b1ad070-5898-4b3a-ab57-57d781c9b809-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515849 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-trusted-ca\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515863 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc8z5\" (UniqueName: 
\"kubernetes.io/projected/348d0b48-f2a9-4326-b8c8-88f43029f382-kube-api-access-jc8z5\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515890 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-tls\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515914 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515930 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6dc6e77-8617-4bc0-8960-6b81b87c8b88-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-snxxh\" (UID: \"e6dc6e77-8617-4bc0-8960-6b81b87c8b88\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515944 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50f7a0ea-7b15-487b-b907-6fb4c7451eed-serving-cert\") pod \"openshift-config-operator-7777fb866f-lcjcd\" (UID: \"50f7a0ea-7b15-487b-b907-6fb4c7451eed\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515963 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/803e0d1c-f298-49b4-9251-9271f311ee92-audit-dir\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515977 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.515992 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516006 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-audit-policies\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516022 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516037 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-config\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516052 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/568f96c6-6a68-4e06-a1e1-1b787f58bac7-serving-cert\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516080 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/effafc66-9dae-4ef3-86a5-72e1fac84fc4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrgwq\" (UID: \"effafc66-9dae-4ef3-86a5-72e1fac84fc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516108 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516130 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516147 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c9967ed-20af-48cf-859d-4c3060d413fb-serving-cert\") pod \"console-operator-58897d9998-qzmkf\" (UID: \"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516162 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/50f7a0ea-7b15-487b-b907-6fb4c7451eed-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lcjcd\" (UID: \"50f7a0ea-7b15-487b-b907-6fb4c7451eed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516176 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516204 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlwtm\" (UniqueName: \"kubernetes.io/projected/9c9967ed-20af-48cf-859d-4c3060d413fb-kube-api-access-nlwtm\") pod \"console-operator-58897d9998-qzmkf\" (UID: \"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516218 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/348d0b48-f2a9-4326-b8c8-88f43029f382-console-oauth-config\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516231 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-service-ca\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516247 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516282 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516300 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qxvw\" (UniqueName: \"kubernetes.io/projected/e6dc6e77-8617-4bc0-8960-6b81b87c8b88-kube-api-access-6qxvw\") pod \"openshift-controller-manager-operator-756b6f6bc6-snxxh\" (UID: \"e6dc6e77-8617-4bc0-8960-6b81b87c8b88\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516321 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/568f96c6-6a68-4e06-a1e1-1b787f58bac7-node-pullsecrets\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.516334 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/568f96c6-6a68-4e06-a1e1-1b787f58bac7-etcd-client\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp"
Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.518446 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.01843611 +0000 UTC m=+228.132050351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.538510 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff6t5\" (UniqueName: \"kubernetes.io/projected/1539b772-1d04-4bc9-85f1-99a99b5d237d-kube-api-access-ff6t5\") pod \"etcd-operator-b45778765-5ndrw\" (UID: \"1539b772-1d04-4bc9-85f1-99a99b5d237d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.543540 4698 request.go:700] Waited for 1.332488737s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/secrets?fieldSelector=metadata.name%3Dauthentication-operator-dockercfg-mz9bj&limit=500&resourceVersion=0
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.545074 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.565140 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.587156 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.591717 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-config\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.604405 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.615956 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/fccccb67-888f-4a34-a701-61926e9819a6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.617701 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.617977 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-config\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618043 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/568f96c6-6a68-4e06-a1e1-1b787f58bac7-serving-cert\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618096 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-config-volume\") pod \"collect-profiles-29532135-2wtjd\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618146 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c45160a9-f0cb-4b39-ad29-67c14871973f-signing-cabundle\") pod \"service-ca-9c57cc56f-f69tr\" (UID: \"c45160a9-f0cb-4b39-ad29-67c14871973f\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69tr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618213 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/effafc66-9dae-4ef3-86a5-72e1fac84fc4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrgwq\" (UID: \"effafc66-9dae-4ef3-86a5-72e1fac84fc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618301 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jp2q\" (UniqueName: \"kubernetes.io/projected/4a2469de-c5ab-4a39-9168-01e03bd4b1c6-kube-api-access-4jp2q\") pod \"dns-default-6h5bj\" (UID: \"4a2469de-c5ab-4a39-9168-01e03bd4b1c6\") " pod="openshift-dns/dns-default-6h5bj"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618354 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhwhr\" (UniqueName: \"kubernetes.io/projected/29e7e4f9-c6e0-4a3a-8ec6-5c863c192667-kube-api-access-xhwhr\") pod \"multus-admission-controller-857f4d67dd-fd5xc\" (UID: \"29e7e4f9-c6e0-4a3a-8ec6-5c863c192667\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618446 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl72l\" (UniqueName: \"kubernetes.io/projected/f5c8edb8-fc4d-440e-94a0-116059aed6ad-kube-api-access-vl72l\") pod \"machine-config-controller-84d6567774-jzvrd\" (UID: \"f5c8edb8-fc4d-440e-94a0-116059aed6ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618516 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618566 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jl59\" (UniqueName: \"kubernetes.io/projected/b681f586-b4e0-4b2a-ab97-ea20583eeb34-kube-api-access-7jl59\") pod \"catalog-operator-68c6474976-bz5kc\" (UID: \"b681f586-b4e0-4b2a-ab97-ea20583eeb34\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618611 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7922021-adba-44fd-aff2-2f0776f3fabe-metrics-tls\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618659 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c9967ed-20af-48cf-859d-4c3060d413fb-serving-cert\") pod \"console-operator-58897d9998-qzmkf\" (UID: \"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618706 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/50f7a0ea-7b15-487b-b907-6fb4c7451eed-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lcjcd\" (UID: \"50f7a0ea-7b15-487b-b907-6fb4c7451eed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618752 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618800 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5c8edb8-fc4d-440e-94a0-116059aed6ad-proxy-tls\") pod \"machine-config-controller-84d6567774-jzvrd\" (UID: \"f5c8edb8-fc4d-440e-94a0-116059aed6ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618851 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50b7cda3-dd1c-4644-b5a6-23957a406b19-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z54pv\" (UID: \"50b7cda3-dd1c-4644-b5a6-23957a406b19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618900 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04421fdc-439e-4b78-b6ce-fcf8957ddf92-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-shwvb\" (UID: \"04421fdc-439e-4b78-b6ce-fcf8957ddf92\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.618963 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-service-ca\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.619018 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.619080 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.619154 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/568f96c6-6a68-4e06-a1e1-1b787f58bac7-etcd-client\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.619204 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdt2j\" (UniqueName: \"kubernetes.io/projected/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-kube-api-access-wdt2j\") pod \"collect-profiles-29532135-2wtjd\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.619252 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/875da7ec-7eeb-4f5c-b849-73863732ebb2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5582h\" (UID: \"875da7ec-7eeb-4f5c-b849-73863732ebb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.619334 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b07a9333-815e-464f-afc6-28c1da857d84-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xcfhs\" (UID: \"b07a9333-815e-464f-afc6-28c1da857d84\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.619382 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c45160a9-f0cb-4b39-ad29-67c14871973f-signing-key\") pod \"service-ca-9c57cc56f-f69tr\" (UID: \"c45160a9-f0cb-4b39-ad29-67c14871973f\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69tr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621123 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621223 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b1ad070-5898-4b3a-ab57-57d781c9b809-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621317 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbzx5\" (UniqueName: \"kubernetes.io/projected/effafc66-9dae-4ef3-86a5-72e1fac84fc4-kube-api-access-gbzx5\") pod \"openshift-apiserver-operator-796bbdcf4f-jrgwq\" (UID: \"effafc66-9dae-4ef3-86a5-72e1fac84fc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621358 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/568f96c6-6a68-4e06-a1e1-1b787f58bac7-audit-dir\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621399 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-console-config\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh"
Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.621434 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.121403103 +0000 UTC m=+228.235017414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621483 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c230438c-2633-4e31-b0da-b1d037e35e0c-apiservice-cert\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621576 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621602 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621628 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/916d037d-f52e-449e-8496-34695060f8d5-cert\") pod \"ingress-canary-6kd89\" (UID: \"916d037d-f52e-449e-8496-34695060f8d5\") " pod="openshift-ingress-canary/ingress-canary-6kd89"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621657 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk7pb\" (UniqueName: \"kubernetes.io/projected/568f96c6-6a68-4e06-a1e1-1b787f58bac7-kube-api-access-hk7pb\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621682 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621704 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7922021-adba-44fd-aff2-2f0776f3fabe-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621745 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3a102f6-2a75-4096-806a-7af5eca816e0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rkxpp\" (UID: \"e3a102f6-2a75-4096-806a-7af5eca816e0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621773 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-bound-sa-token\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621797 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/348d0b48-f2a9-4326-b8c8-88f43029f382-console-serving-cert\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621819 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621842 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a2469de-c5ab-4a39-9168-01e03bd4b1c6-metrics-tls\") pod \"dns-default-6h5bj\" (UID: \"4a2469de-c5ab-4a39-9168-01e03bd4b1c6\") " pod="openshift-dns/dns-default-6h5bj"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621862 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/29e7e4f9-c6e0-4a3a-8ec6-5c863c192667-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fd5xc\" (UID: \"29e7e4f9-c6e0-4a3a-8ec6-5c863c192667\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621890 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621925 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-image-import-ca\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621949 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46p6r\" (UniqueName: \"kubernetes.io/projected/875da7ec-7eeb-4f5c-b849-73863732ebb2-kube-api-access-46p6r\") pod \"control-plane-machine-set-operator-78cbb6b69f-5582h\" (UID: \"875da7ec-7eeb-4f5c-b849-73863732ebb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621982 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c230438c-2633-4e31-b0da-b1d037e35e0c-webhook-cert\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.621996 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/50f7a0ea-7b15-487b-b907-6fb4c7451eed-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lcjcd\" (UID: \"50f7a0ea-7b15-487b-b907-6fb4c7451eed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622005 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-certificates\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622065 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bbfa8949-aef4-4d80-8ece-7af18d74a9a0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sn5g8\" (UID: \"bbfa8949-aef4-4d80-8ece-7af18d74a9a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622094 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x7rs\" (UniqueName: \"kubernetes.io/projected/e7922021-adba-44fd-aff2-2f0776f3fabe-kube-api-access-6x7rs\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622120 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggrxx\" (UniqueName: \"kubernetes.io/projected/aed98b22-f91a-4aba-ab64-65fc09af1478-kube-api-access-ggrxx\") pod \"service-ca-operator-777779d784-csv7z\" (UID: \"aed98b22-f91a-4aba-ab64-65fc09af1478\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622154 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74k5q\" (UniqueName: \"kubernetes.io/projected/50f7a0ea-7b15-487b-b907-6fb4c7451eed-kube-api-access-74k5q\") pod \"openshift-config-operator-7777fb866f-lcjcd\" (UID: \"50f7a0ea-7b15-487b-b907-6fb4c7451eed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622178 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkh9l\" (UniqueName: \"kubernetes.io/projected/803e0d1c-f298-49b4-9251-9271f311ee92-kube-api-access-wkh9l\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622210 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl7fn\" (UniqueName: \"kubernetes.io/projected/916d037d-f52e-449e-8496-34695060f8d5-kube-api-access-sl7fn\") pod \"ingress-canary-6kd89\" (UID: \"916d037d-f52e-449e-8496-34695060f8d5\") " pod="openshift-ingress-canary/ingress-canary-6kd89"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622290 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-registration-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622318 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-secret-volume\") pod \"collect-profiles-29532135-2wtjd\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622343 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c230438c-2633-4e31-b0da-b1d037e35e0c-tmpfs\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622370 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-default-certificate\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622416 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-trusted-ca\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622438 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc8z5\" (UniqueName: \"kubernetes.io/projected/348d0b48-f2a9-4326-b8c8-88f43029f382-kube-api-access-jc8z5\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622460 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f8136f49-df98-42ad-98e7-93ddb91c2063-node-bootstrap-token\") pod \"machine-config-server-z2fjr\" (UID: \"f8136f49-df98-42ad-98e7-93ddb91c2063\") " pod="openshift-machine-config-operator/machine-config-server-z2fjr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622483 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f8136f49-df98-42ad-98e7-93ddb91c2063-certs\") pod \"machine-config-server-z2fjr\" (UID: \"f8136f49-df98-42ad-98e7-93ddb91c2063\") " pod="openshift-machine-config-operator/machine-config-server-z2fjr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622520 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6dc6e77-8617-4bc0-8960-6b81b87c8b88-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-snxxh\" (UID: \"e6dc6e77-8617-4bc0-8960-6b81b87c8b88\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622546 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50f7a0ea-7b15-487b-b907-6fb4c7451eed-serving-cert\") pod \"openshift-config-operator-7777fb866f-lcjcd\" (UID: \"50f7a0ea-7b15-487b-b907-6fb4c7451eed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622573 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a2469de-c5ab-4a39-9168-01e03bd4b1c6-config-volume\") pod \"dns-default-6h5bj\" (UID: \"4a2469de-c5ab-4a39-9168-01e03bd4b1c6\") " pod="openshift-dns/dns-default-6h5bj"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622611 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmnnk\" (UniqueName: \"kubernetes.io/projected/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-kube-api-access-mmnnk\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622643 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622664 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-mountpoint-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622706 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-audit-policies\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622740 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622771 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-metrics-certs\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622793 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9kdb\" (UniqueName: \"kubernetes.io/projected/19c1f910-f805-4151-8eb8-7a6628a62b5b-kube-api-access-w9kdb\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622815 4698
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzjzw\" (UniqueName: \"kubernetes.io/projected/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-kube-api-access-hzjzw\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622845 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rbbl\" (UniqueName: \"kubernetes.io/projected/50b7cda3-dd1c-4644-b5a6-23957a406b19-kube-api-access-4rbbl\") pod \"kube-storage-version-migrator-operator-b67b599dd-z54pv\" (UID: \"50b7cda3-dd1c-4644-b5a6-23957a406b19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622865 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-proxy-tls\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622887 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-97zrr\" (UID: \"ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622912 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622936 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-79f62\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-79f62" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622961 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlwtm\" (UniqueName: \"kubernetes.io/projected/9c9967ed-20af-48cf-859d-4c3060d413fb-kube-api-access-nlwtm\") pod \"console-operator-58897d9998-qzmkf\" (UID: \"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.622983 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/348d0b48-f2a9-4326-b8c8-88f43029f382-console-oauth-config\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623006 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b681f586-b4e0-4b2a-ab97-ea20583eeb34-profile-collector-cert\") pod \"catalog-operator-68c6474976-bz5kc\" (UID: \"b681f586-b4e0-4b2a-ab97-ea20583eeb34\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc" Feb 24 
10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623041 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qxvw\" (UniqueName: \"kubernetes.io/projected/e6dc6e77-8617-4bc0-8960-6b81b87c8b88-kube-api-access-6qxvw\") pod \"openshift-controller-manager-operator-756b6f6bc6-snxxh\" (UID: \"e6dc6e77-8617-4bc0-8960-6b81b87c8b88\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623068 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/568f96c6-6a68-4e06-a1e1-1b787f58bac7-node-pullsecrets\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623093 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-97zrr\" (UID: \"ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623114 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/bbfa8949-aef4-4d80-8ece-7af18d74a9a0-kube-api-access-6xxpc\") pod \"olm-operator-6b444d44fb-sn5g8\" (UID: \"bbfa8949-aef4-4d80-8ece-7af18d74a9a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623136 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b681f586-b4e0-4b2a-ab97-ea20583eeb34-srv-cert\") pod \"catalog-operator-68c6474976-bz5kc\" (UID: \"b681f586-b4e0-4b2a-ab97-ea20583eeb34\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623151 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-certificates\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623157 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-79f62\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-79f62" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623216 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-trusted-ca-bundle\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623250 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04421fdc-439e-4b78-b6ce-fcf8957ddf92-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-shwvb\" (UID: \"04421fdc-439e-4b78-b6ce-fcf8957ddf92\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb" Feb 24 
10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623309 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-97zrr\" (UID: \"ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623337 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-audit\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623362 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-socket-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623389 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4rfn\" (UniqueName: \"kubernetes.io/projected/e3a102f6-2a75-4096-806a-7af5eca816e0-kube-api-access-b4rfn\") pod \"package-server-manager-789f6589d5-rkxpp\" (UID: \"e3a102f6-2a75-4096-806a-7af5eca816e0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623427 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed98b22-f91a-4aba-ab64-65fc09af1478-config\") pod 
\"service-ca-operator-777779d784-csv7z\" (UID: \"aed98b22-f91a-4aba-ab64-65fc09af1478\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623454 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wzkz\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-kube-api-access-5wzkz\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623483 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr8mq\" (UniqueName: \"kubernetes.io/projected/6b1ad070-5898-4b3a-ab57-57d781c9b809-kube-api-access-kr8mq\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623510 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aed98b22-f91a-4aba-ab64-65fc09af1478-serving-cert\") pod \"service-ca-operator-777779d784-csv7z\" (UID: \"aed98b22-f91a-4aba-ab64-65fc09af1478\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623533 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pk54\" (UniqueName: \"kubernetes.io/projected/f8136f49-df98-42ad-98e7-93ddb91c2063-kube-api-access-9pk54\") pod \"machine-config-server-z2fjr\" (UID: \"f8136f49-df98-42ad-98e7-93ddb91c2063\") " pod="openshift-machine-config-operator/machine-config-server-z2fjr" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623558 4698 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623581 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgw9l\" (UniqueName: \"kubernetes.io/projected/c45160a9-f0cb-4b39-ad29-67c14871973f-kube-api-access-wgw9l\") pod \"service-ca-9c57cc56f-f69tr\" (UID: \"c45160a9-f0cb-4b39-ad29-67c14871973f\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69tr" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623604 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/133fe0e9-d231-45aa-bea8-6add237cffb4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-265tl\" (UID: \"133fe0e9-d231-45aa-bea8-6add237cffb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623626 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/133fe0e9-d231-45aa-bea8-6add237cffb4-config\") pod \"kube-apiserver-operator-766d6c64bb-265tl\" (UID: \"133fe0e9-d231-45aa-bea8-6add237cffb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623653 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-oauth-serving-cert\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " 
pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623677 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/312007fb-fd23-4a36-b653-ea3e24a02ee0-metrics-tls\") pod \"dns-operator-744455d44c-v6qbj\" (UID: \"312007fb-fd23-4a36-b653-ea3e24a02ee0\") " pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623705 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c9967ed-20af-48cf-859d-4c3060d413fb-config\") pod \"console-operator-58897d9998-qzmkf\" (UID: \"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623730 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04421fdc-439e-4b78-b6ce-fcf8957ddf92-config\") pod \"kube-controller-manager-operator-78b949d7b-shwvb\" (UID: \"04421fdc-439e-4b78-b6ce-fcf8957ddf92\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623756 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c9967ed-20af-48cf-859d-4c3060d413fb-trusted-ca\") pod \"console-operator-58897d9998-qzmkf\" (UID: \"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623790 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-service-ca-bundle\") 
pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623823 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b1ad070-5898-4b3a-ab57-57d781c9b809-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623847 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effafc66-9dae-4ef3-86a5-72e1fac84fc4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrgwq\" (UID: \"effafc66-9dae-4ef3-86a5-72e1fac84fc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623871 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b7cda3-dd1c-4644-b5a6-23957a406b19-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z54pv\" (UID: \"50b7cda3-dd1c-4644-b5a6-23957a406b19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623892 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-stats-auth\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623914 4698 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-images\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623937 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdgfd\" (UniqueName: \"kubernetes.io/projected/108d72f5-0dd9-4965-a41f-7403ad8fce04-kube-api-access-mdgfd\") pod \"downloads-7954f5f757-z42jf\" (UID: \"108d72f5-0dd9-4965-a41f-7403ad8fce04\") " pod="openshift-console/downloads-7954f5f757-z42jf" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623959 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7922021-adba-44fd-aff2-2f0776f3fabe-trusted-ca\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.623984 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/568f96c6-6a68-4e06-a1e1-1b787f58bac7-encryption-config\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624033 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bbfa8949-aef4-4d80-8ece-7af18d74a9a0-srv-cert\") pod \"olm-operator-6b444d44fb-sn5g8\" (UID: \"bbfa8949-aef4-4d80-8ece-7af18d74a9a0\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624058 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvmx4\" (UniqueName: \"kubernetes.io/projected/d0de08e0-63c0-4a90-a264-1bc41b8746d8-kube-api-access-kvmx4\") pod \"marketplace-operator-79b997595-79f62\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-79f62" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624114 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624116 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-console-config\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624139 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-etcd-serving-ca\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624164 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/f5c8edb8-fc4d-440e-94a0-116059aed6ad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jzvrd\" (UID: \"f5c8edb8-fc4d-440e-94a0-116059aed6ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624187 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-plugins-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624215 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6dc6e77-8617-4bc0-8960-6b81b87c8b88-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-snxxh\" (UID: \"e6dc6e77-8617-4bc0-8960-6b81b87c8b88\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624240 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kgzn\" (UniqueName: \"kubernetes.io/projected/312007fb-fd23-4a36-b653-ea3e24a02ee0-kube-api-access-9kgzn\") pod \"dns-operator-744455d44c-v6qbj\" (UID: \"312007fb-fd23-4a36-b653-ea3e24a02ee0\") " pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624291 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b1ad070-5898-4b3a-ab57-57d781c9b809-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" 
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624316 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjxkt\" (UniqueName: \"kubernetes.io/projected/0afb3450-a53f-4b58-9472-7bbec9f4eb54-kube-api-access-pjxkt\") pod \"migrator-59844c95c7-skflh\" (UID: \"0afb3450-a53f-4b58-9472-7bbec9f4eb54\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624352 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-tls\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624376 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/133fe0e9-d231-45aa-bea8-6add237cffb4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-265tl\" (UID: \"133fe0e9-d231-45aa-bea8-6add237cffb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624426 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624452 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bmbt\" (UniqueName: 
\"kubernetes.io/projected/b07a9333-815e-464f-afc6-28c1da857d84-kube-api-access-6bmbt\") pod \"cluster-samples-operator-665b6dd947-xcfhs\" (UID: \"b07a9333-815e-464f-afc6-28c1da857d84\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624479 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/803e0d1c-f298-49b4-9251-9271f311ee92-audit-dir\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624503 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624529 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2pb2\" (UniqueName: \"kubernetes.io/projected/c230438c-2633-4e31-b0da-b1d037e35e0c-kube-api-access-l2pb2\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.624556 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-csi-data-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" Feb 24 10:20:02 crc 
kubenswrapper[4698]: I0224 10:20:02.624679 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-trusted-ca\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.625014 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/effafc66-9dae-4ef3-86a5-72e1fac84fc4-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jrgwq\" (UID: \"effafc66-9dae-4ef3-86a5-72e1fac84fc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.625152 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.625229 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/568f96c6-6a68-4e06-a1e1-1b787f58bac7-audit-dir\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.626383 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6b1ad070-5898-4b3a-ab57-57d781c9b809-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.626575 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.626797 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.627004 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-etcd-serving-ca\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.627594 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-image-import-ca\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.627664 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/568f96c6-6a68-4e06-a1e1-1b787f58bac7-etcd-client\") pod 
\"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.627995 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.628062 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.628099 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-config\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.628713 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-audit\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.629069 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c9967ed-20af-48cf-859d-4c3060d413fb-config\") pod \"console-operator-58897d9998-qzmkf\" (UID: 
\"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.629132 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/312007fb-fd23-4a36-b653-ea3e24a02ee0-metrics-tls\") pod \"dns-operator-744455d44c-v6qbj\" (UID: \"312007fb-fd23-4a36-b653-ea3e24a02ee0\") " pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.629857 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.629897 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/803e0d1c-f298-49b4-9251-9271f311ee92-audit-dir\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.629954 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/568f96c6-6a68-4e06-a1e1-1b787f58bac7-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.630094 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9c9967ed-20af-48cf-859d-4c3060d413fb-trusted-ca\") pod \"console-operator-58897d9998-qzmkf\" 
(UID: \"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf" Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.630122 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.130109305 +0000 UTC m=+228.243723606 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.630696 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.630740 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/568f96c6-6a68-4e06-a1e1-1b787f58bac7-encryption-config\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.630923 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/568f96c6-6a68-4e06-a1e1-1b787f58bac7-node-pullsecrets\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.631393 4698 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/348d0b48-f2a9-4326-b8c8-88f43029f382-console-oauth-config\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.631430 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/effafc66-9dae-4ef3-86a5-72e1fac84fc4-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jrgwq\" (UID: \"effafc66-9dae-4ef3-86a5-72e1fac84fc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.631465 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-trusted-ca-bundle\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.632887 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-service-ca\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.632956 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/568f96c6-6a68-4e06-a1e1-1b787f58bac7-serving-cert\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.633950 4698 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-audit-policies\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.635793 4698 projected.go:194] Error preparing data for projected volume kube-api-access-kb89h for pod openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl: failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.636229 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c9967ed-20af-48cf-859d-4c3060d413fb-serving-cert\") pod \"console-operator-58897d9998-qzmkf\" (UID: \"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf" Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.636688 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-kube-api-access-kb89h podName:7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.136647717 +0000 UTC m=+228.250262038 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-kb89h" (UniqueName: "kubernetes.io/projected/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-kube-api-access-kb89h") pod "route-controller-manager-6576b87f9c-bfmpl" (UID: "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f") : failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.636782 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6dc6e77-8617-4bc0-8960-6b81b87c8b88-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-snxxh\" (UID: \"e6dc6e77-8617-4bc0-8960-6b81b87c8b88\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.636890 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/348d0b48-f2a9-4326-b8c8-88f43029f382-oauth-serving-cert\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.636904 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.637828 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e6dc6e77-8617-4bc0-8960-6b81b87c8b88-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-snxxh\" (UID: \"e6dc6e77-8617-4bc0-8960-6b81b87c8b88\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.637828 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.638740 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.638772 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.641980 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-tls\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.643001 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/348d0b48-f2a9-4326-b8c8-88f43029f382-console-serving-cert\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.643165 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b1ad070-5898-4b3a-ab57-57d781c9b809-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.643629 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.644020 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50f7a0ea-7b15-487b-b907-6fb4c7451eed-serving-cert\") pod \"openshift-config-operator-7777fb866f-lcjcd\" (UID: \"50f7a0ea-7b15-487b-b907-6fb4c7451eed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.644450 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:02 crc 
kubenswrapper[4698]: I0224 10:20:02.644559 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.644615 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.652031 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.664148 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.684649 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.692376 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-client-ca\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.696537 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/fccccb67-888f-4a34-a701-61926e9819a6-images\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.706329 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.712817 4698 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59c9844c-00fe-42cd-add6-9ab528da273d-serving-cert\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.716988 4698 projected.go:288] Couldn't get configMap openshift-authentication-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.717011 4698 projected.go:194] Error preparing data for projected volume kube-api-access-zk9sk for pod openshift-authentication-operator/authentication-operator-69f744f599-fq25r: failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.717055 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59c9844c-00fe-42cd-add6-9ab528da273d-kube-api-access-zk9sk podName:59c9844c-00fe-42cd-add6-9ab528da273d nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.217041805 +0000 UTC m=+228.330656046 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-zk9sk" (UniqueName: "kubernetes.io/projected/59c9844c-00fe-42cd-add6-9ab528da273d-kube-api-access-zk9sk") pod "authentication-operator-69f744f599-fq25r" (UID: "59c9844c-00fe-42cd-add6-9ab528da273d") : failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.725302 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.725608 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.725740 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.225725176 +0000 UTC m=+228.339339417 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.725794 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-79f62\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-79f62" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.725834 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b681f586-b4e0-4b2a-ab97-ea20583eeb34-profile-collector-cert\") pod \"catalog-operator-68c6474976-bz5kc\" (UID: \"b681f586-b4e0-4b2a-ab97-ea20583eeb34\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.725861 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-97zrr\" (UID: \"ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.725891 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xxpc\" (UniqueName: 
\"kubernetes.io/projected/bbfa8949-aef4-4d80-8ece-7af18d74a9a0-kube-api-access-6xxpc\") pod \"olm-operator-6b444d44fb-sn5g8\" (UID: \"bbfa8949-aef4-4d80-8ece-7af18d74a9a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726199 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b681f586-b4e0-4b2a-ab97-ea20583eeb34-srv-cert\") pod \"catalog-operator-68c6474976-bz5kc\" (UID: \"b681f586-b4e0-4b2a-ab97-ea20583eeb34\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726652 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-79f62\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-79f62" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726686 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04421fdc-439e-4b78-b6ce-fcf8957ddf92-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-shwvb\" (UID: \"04421fdc-439e-4b78-b6ce-fcf8957ddf92\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726727 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-97zrr\" (UID: \"ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726749 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4rfn\" (UniqueName: \"kubernetes.io/projected/e3a102f6-2a75-4096-806a-7af5eca816e0-kube-api-access-b4rfn\") pod \"package-server-manager-789f6589d5-rkxpp\" (UID: \"e3a102f6-2a75-4096-806a-7af5eca816e0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726819 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-socket-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726846 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed98b22-f91a-4aba-ab64-65fc09af1478-config\") pod \"service-ca-operator-777779d784-csv7z\" (UID: \"aed98b22-f91a-4aba-ab64-65fc09af1478\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726881 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aed98b22-f91a-4aba-ab64-65fc09af1478-serving-cert\") pod \"service-ca-operator-777779d784-csv7z\" (UID: \"aed98b22-f91a-4aba-ab64-65fc09af1478\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726903 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pk54\" (UniqueName: 
\"kubernetes.io/projected/f8136f49-df98-42ad-98e7-93ddb91c2063-kube-api-access-9pk54\") pod \"machine-config-server-z2fjr\" (UID: \"f8136f49-df98-42ad-98e7-93ddb91c2063\") " pod="openshift-machine-config-operator/machine-config-server-z2fjr" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726941 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/133fe0e9-d231-45aa-bea8-6add237cffb4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-265tl\" (UID: \"133fe0e9-d231-45aa-bea8-6add237cffb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726970 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/133fe0e9-d231-45aa-bea8-6add237cffb4-config\") pod \"kube-apiserver-operator-766d6c64bb-265tl\" (UID: \"133fe0e9-d231-45aa-bea8-6add237cffb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.726992 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgw9l\" (UniqueName: \"kubernetes.io/projected/c45160a9-f0cb-4b39-ad29-67c14871973f-kube-api-access-wgw9l\") pod \"service-ca-9c57cc56f-f69tr\" (UID: \"c45160a9-f0cb-4b39-ad29-67c14871973f\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69tr" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.727014 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04421fdc-439e-4b78-b6ce-fcf8957ddf92-config\") pod \"kube-controller-manager-operator-78b949d7b-shwvb\" (UID: \"04421fdc-439e-4b78-b6ce-fcf8957ddf92\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 
10:20:02.727036 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-service-ca-bundle\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.727061 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b7cda3-dd1c-4644-b5a6-23957a406b19-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z54pv\" (UID: \"50b7cda3-dd1c-4644-b5a6-23957a406b19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.727080 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-stats-auth\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.727102 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-images\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.727121 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-79f62\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-79f62"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.727133 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdgfd\" (UniqueName: \"kubernetes.io/projected/108d72f5-0dd9-4965-a41f-7403ad8fce04-kube-api-access-mdgfd\") pod \"downloads-7954f5f757-z42jf\" (UID: \"108d72f5-0dd9-4965-a41f-7403ad8fce04\") " pod="openshift-console/downloads-7954f5f757-z42jf"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.727193 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7922021-adba-44fd-aff2-2f0776f3fabe-trusted-ca\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.727216 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bbfa8949-aef4-4d80-8ece-7af18d74a9a0-srv-cert\") pod \"olm-operator-6b444d44fb-sn5g8\" (UID: \"bbfa8949-aef4-4d80-8ece-7af18d74a9a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.727248 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5c8edb8-fc4d-440e-94a0-116059aed6ad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jzvrd\" (UID: \"f5c8edb8-fc4d-440e-94a0-116059aed6ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.727288 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvmx4\" (UniqueName: \"kubernetes.io/projected/d0de08e0-63c0-4a90-a264-1bc41b8746d8-kube-api-access-kvmx4\") pod \"marketplace-operator-79b997595-79f62\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-79f62"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.727307 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-plugins-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.728491 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-service-ca-bundle\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.728871 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/133fe0e9-d231-45aa-bea8-6add237cffb4-config\") pod \"kube-apiserver-operator-766d6c64bb-265tl\" (UID: \"133fe0e9-d231-45aa-bea8-6add237cffb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.728900 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50b7cda3-dd1c-4644-b5a6-23957a406b19-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-z54pv\" (UID: \"50b7cda3-dd1c-4644-b5a6-23957a406b19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.729018 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04421fdc-439e-4b78-b6ce-fcf8957ddf92-config\") pod \"kube-controller-manager-operator-78b949d7b-shwvb\" (UID: \"04421fdc-439e-4b78-b6ce-fcf8957ddf92\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.729213 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-socket-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.738178 4698 projected.go:288] Couldn't get configMap openshift-controller-manager/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.738211 4698 projected.go:194] Error preparing data for projected volume kube-api-access-h46gj for pod openshift-controller-manager/controller-manager-879f6c89f-w8bmq: failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.738961 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f5c8edb8-fc4d-440e-94a0-116059aed6ad-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jzvrd\" (UID: \"f5c8edb8-fc4d-440e-94a0-116059aed6ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.739430 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-images\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.739699 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7922021-adba-44fd-aff2-2f0776f3fabe-trusted-ca\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.741831 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b681f586-b4e0-4b2a-ab97-ea20583eeb34-profile-collector-cert\") pod \"catalog-operator-68c6474976-bz5kc\" (UID: \"b681f586-b4e0-4b2a-ab97-ea20583eeb34\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.744536 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-plugins-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.744615 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/38bdf14d-35ac-440b-9a16-9a4ddd53df34-kube-api-access-h46gj podName:38bdf14d-35ac-440b-9a16-9a4ddd53df34 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.238282558 +0000 UTC m=+228.351896799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-h46gj" (UniqueName: "kubernetes.io/projected/38bdf14d-35ac-440b-9a16-9a4ddd53df34-kube-api-access-h46gj") pod "controller-manager-879f6c89f-w8bmq" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34") : failed to sync configmap cache: timed out waiting for the condition
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.744701 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjxkt\" (UniqueName: \"kubernetes.io/projected/0afb3450-a53f-4b58-9472-7bbec9f4eb54-kube-api-access-pjxkt\") pod \"migrator-59844c95c7-skflh\" (UID: \"0afb3450-a53f-4b58-9472-7bbec9f4eb54\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.744764 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/133fe0e9-d231-45aa-bea8-6add237cffb4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-265tl\" (UID: \"133fe0e9-d231-45aa-bea8-6add237cffb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.744810 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.744839 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bmbt\" (UniqueName: \"kubernetes.io/projected/b07a9333-815e-464f-afc6-28c1da857d84-kube-api-access-6bmbt\") pod \"cluster-samples-operator-665b6dd947-xcfhs\" (UID: \"b07a9333-815e-464f-afc6-28c1da857d84\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.744879 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2pb2\" (UniqueName: \"kubernetes.io/projected/c230438c-2633-4e31-b0da-b1d037e35e0c-kube-api-access-l2pb2\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.744907 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-csi-data-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.744934 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-config-volume\") pod \"collect-profiles-29532135-2wtjd\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.744954 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c45160a9-f0cb-4b39-ad29-67c14871973f-signing-cabundle\") pod \"service-ca-9c57cc56f-f69tr\" (UID: \"c45160a9-f0cb-4b39-ad29-67c14871973f\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69tr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.744978 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jp2q\" (UniqueName: \"kubernetes.io/projected/4a2469de-c5ab-4a39-9168-01e03bd4b1c6-kube-api-access-4jp2q\") pod \"dns-default-6h5bj\" (UID: \"4a2469de-c5ab-4a39-9168-01e03bd4b1c6\") " pod="openshift-dns/dns-default-6h5bj"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745001 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhwhr\" (UniqueName: \"kubernetes.io/projected/29e7e4f9-c6e0-4a3a-8ec6-5c863c192667-kube-api-access-xhwhr\") pod \"multus-admission-controller-857f4d67dd-fd5xc\" (UID: \"29e7e4f9-c6e0-4a3a-8ec6-5c863c192667\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745033 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl72l\" (UniqueName: \"kubernetes.io/projected/f5c8edb8-fc4d-440e-94a0-116059aed6ad-kube-api-access-vl72l\") pod \"machine-config-controller-84d6567774-jzvrd\" (UID: \"f5c8edb8-fc4d-440e-94a0-116059aed6ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745060 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jl59\" (UniqueName: \"kubernetes.io/projected/b681f586-b4e0-4b2a-ab97-ea20583eeb34-kube-api-access-7jl59\") pod \"catalog-operator-68c6474976-bz5kc\" (UID: \"b681f586-b4e0-4b2a-ab97-ea20583eeb34\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745075 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-stats-auth\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745084 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7922021-adba-44fd-aff2-2f0776f3fabe-metrics-tls\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745163 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5c8edb8-fc4d-440e-94a0-116059aed6ad-proxy-tls\") pod \"machine-config-controller-84d6567774-jzvrd\" (UID: \"f5c8edb8-fc4d-440e-94a0-116059aed6ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745189 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50b7cda3-dd1c-4644-b5a6-23957a406b19-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z54pv\" (UID: \"50b7cda3-dd1c-4644-b5a6-23957a406b19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745214 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04421fdc-439e-4b78-b6ce-fcf8957ddf92-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-shwvb\" (UID: \"04421fdc-439e-4b78-b6ce-fcf8957ddf92\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745282 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/875da7ec-7eeb-4f5c-b849-73863732ebb2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5582h\" (UID: \"875da7ec-7eeb-4f5c-b849-73863732ebb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745305 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b07a9333-815e-464f-afc6-28c1da857d84-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xcfhs\" (UID: \"b07a9333-815e-464f-afc6-28c1da857d84\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745330 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c45160a9-f0cb-4b39-ad29-67c14871973f-signing-key\") pod \"service-ca-9c57cc56f-f69tr\" (UID: \"c45160a9-f0cb-4b39-ad29-67c14871973f\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69tr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745356 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdt2j\" (UniqueName: \"kubernetes.io/projected/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-kube-api-access-wdt2j\") pod \"collect-profiles-29532135-2wtjd\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745411 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c230438c-2633-4e31-b0da-b1d037e35e0c-apiservice-cert\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745437 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/916d037d-f52e-449e-8496-34695060f8d5-cert\") pod \"ingress-canary-6kd89\" (UID: \"916d037d-f52e-449e-8496-34695060f8d5\") " pod="openshift-ingress-canary/ingress-canary-6kd89"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745468 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745487 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7922021-adba-44fd-aff2-2f0776f3fabe-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745507 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3a102f6-2a75-4096-806a-7af5eca816e0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rkxpp\" (UID: \"e3a102f6-2a75-4096-806a-7af5eca816e0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745529 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a2469de-c5ab-4a39-9168-01e03bd4b1c6-metrics-tls\") pod \"dns-default-6h5bj\" (UID: \"4a2469de-c5ab-4a39-9168-01e03bd4b1c6\") " pod="openshift-dns/dns-default-6h5bj"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745549 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/29e7e4f9-c6e0-4a3a-8ec6-5c863c192667-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fd5xc\" (UID: \"29e7e4f9-c6e0-4a3a-8ec6-5c863c192667\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745591 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46p6r\" (UniqueName: \"kubernetes.io/projected/875da7ec-7eeb-4f5c-b849-73863732ebb2-kube-api-access-46p6r\") pod \"control-plane-machine-set-operator-78cbb6b69f-5582h\" (UID: \"875da7ec-7eeb-4f5c-b849-73863732ebb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745622 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c230438c-2633-4e31-b0da-b1d037e35e0c-webhook-cert\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745642 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bbfa8949-aef4-4d80-8ece-7af18d74a9a0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sn5g8\" (UID: \"bbfa8949-aef4-4d80-8ece-7af18d74a9a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745662 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x7rs\" (UniqueName: \"kubernetes.io/projected/e7922021-adba-44fd-aff2-2f0776f3fabe-kube-api-access-6x7rs\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745685 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggrxx\" (UniqueName: \"kubernetes.io/projected/aed98b22-f91a-4aba-ab64-65fc09af1478-kube-api-access-ggrxx\") pod \"service-ca-operator-777779d784-csv7z\" (UID: \"aed98b22-f91a-4aba-ab64-65fc09af1478\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745728 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl7fn\" (UniqueName: \"kubernetes.io/projected/916d037d-f52e-449e-8496-34695060f8d5-kube-api-access-sl7fn\") pod \"ingress-canary-6kd89\" (UID: \"916d037d-f52e-449e-8496-34695060f8d5\") " pod="openshift-ingress-canary/ingress-canary-6kd89"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745751 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-registration-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745775 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-secret-volume\") pod \"collect-profiles-29532135-2wtjd\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745795 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c230438c-2633-4e31-b0da-b1d037e35e0c-tmpfs\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745814 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-default-certificate\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745866 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f8136f49-df98-42ad-98e7-93ddb91c2063-node-bootstrap-token\") pod \"machine-config-server-z2fjr\" (UID: \"f8136f49-df98-42ad-98e7-93ddb91c2063\") " pod="openshift-machine-config-operator/machine-config-server-z2fjr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745892 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f8136f49-df98-42ad-98e7-93ddb91c2063-certs\") pod \"machine-config-server-z2fjr\" (UID: \"f8136f49-df98-42ad-98e7-93ddb91c2063\") " pod="openshift-machine-config-operator/machine-config-server-z2fjr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745915 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a2469de-c5ab-4a39-9168-01e03bd4b1c6-config-volume\") pod \"dns-default-6h5bj\" (UID: \"4a2469de-c5ab-4a39-9168-01e03bd4b1c6\") " pod="openshift-dns/dns-default-6h5bj"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745932 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmnnk\" (UniqueName: \"kubernetes.io/projected/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-kube-api-access-mmnnk\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745956 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-mountpoint-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.745989 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-metrics-certs\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.746010 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9kdb\" (UniqueName: \"kubernetes.io/projected/19c1f910-f805-4151-8eb8-7a6628a62b5b-kube-api-access-w9kdb\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.746031 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzjzw\" (UniqueName: \"kubernetes.io/projected/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-kube-api-access-hzjzw\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.746055 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rbbl\" (UniqueName: \"kubernetes.io/projected/50b7cda3-dd1c-4644-b5a6-23957a406b19-kube-api-access-4rbbl\") pod \"kube-storage-version-migrator-operator-b67b599dd-z54pv\" (UID: \"50b7cda3-dd1c-4644-b5a6-23957a406b19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.746077 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-proxy-tls\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.746095 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-97zrr\" (UID: \"ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.746754 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-97zrr\" (UID: \"ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.747506 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b681f586-b4e0-4b2a-ab97-ea20583eeb34-srv-cert\") pod \"catalog-operator-68c6474976-bz5kc\" (UID: \"b681f586-b4e0-4b2a-ab97-ea20583eeb34\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.747567 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-97zrr\" (UID: \"ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.747582 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed98b22-f91a-4aba-ab64-65fc09af1478-config\") pod \"service-ca-operator-777779d784-csv7z\" (UID: \"aed98b22-f91a-4aba-ab64-65fc09af1478\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.747668 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-csi-data-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.748067 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.248053125 +0000 UTC m=+228.361667366 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.748128 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7922021-adba-44fd-aff2-2f0776f3fabe-metrics-tls\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.748809 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50b7cda3-dd1c-4644-b5a6-23957a406b19-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-z54pv\" (UID: \"50b7cda3-dd1c-4644-b5a6-23957a406b19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.748829 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-registration-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.749307 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-config-volume\") pod \"collect-profiles-29532135-2wtjd\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.752944 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a2469de-c5ab-4a39-9168-01e03bd4b1c6-config-volume\") pod \"dns-default-6h5bj\" (UID: \"4a2469de-c5ab-4a39-9168-01e03bd4b1c6\") " pod="openshift-dns/dns-default-6h5bj"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.753408 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04421fdc-439e-4b78-b6ce-fcf8957ddf92-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-shwvb\" (UID: \"04421fdc-439e-4b78-b6ce-fcf8957ddf92\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.753813 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c45160a9-f0cb-4b39-ad29-67c14871973f-signing-cabundle\") pod \"service-ca-9c57cc56f-f69tr\" (UID: \"c45160a9-f0cb-4b39-ad29-67c14871973f\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69tr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.753812 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aed98b22-f91a-4aba-ab64-65fc09af1478-serving-cert\") pod \"service-ca-operator-777779d784-csv7z\" (UID: \"aed98b22-f91a-4aba-ab64-65fc09af1478\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.754574 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-79f62\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-79f62"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.755340 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/b07a9333-815e-464f-afc6-28c1da857d84-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-xcfhs\" (UID: \"b07a9333-815e-464f-afc6-28c1da857d84\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.755537 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-secret-volume\") pod \"collect-profiles-29532135-2wtjd\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.755948 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/133fe0e9-d231-45aa-bea8-6add237cffb4-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-265tl\" (UID: \"133fe0e9-d231-45aa-bea8-6add237cffb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.756008 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/875da7ec-7eeb-4f5c-b849-73863732ebb2-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-5582h\" (UID: \"875da7ec-7eeb-4f5c-b849-73863732ebb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.758231 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.758997 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/f8136f49-df98-42ad-98e7-93ddb91c2063-certs\") pod \"machine-config-server-z2fjr\" (UID: \"f8136f49-df98-42ad-98e7-93ddb91c2063\") " pod="openshift-machine-config-operator/machine-config-server-z2fjr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.759332 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" event={"ID":"21ede3a0-b26f-4e29-8a13-86d877b60519","Type":"ContainerStarted","Data":"3b0a5c591c31e0440a1cadc857e183bcef2d6f5dbc80f68c658fca10e25aa9e9"}
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.759465 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c45160a9-f0cb-4b39-ad29-67c14871973f-signing-key\") pod \"service-ca-9c57cc56f-f69tr\" (UID: \"c45160a9-f0cb-4b39-ad29-67c14871973f\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69tr"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.760876 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-config\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"
Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.761328 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bbfa8949-aef4-4d80-8ece-7af18d74a9a0-srv-cert\") 
pod \"olm-operator-6b444d44fb-sn5g8\" (UID: \"bbfa8949-aef4-4d80-8ece-7af18d74a9a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.761489 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" event={"ID":"a09fb76f-4291-4945-8bb0-15c478a35cbf","Type":"ContainerStarted","Data":"7fd8d6d721f3225bddb7ccede14bfbe180f7c2655122883e35de7097f4548fae"} Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.762752 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a2469de-c5ab-4a39-9168-01e03bd4b1c6-metrics-tls\") pod \"dns-default-6h5bj\" (UID: \"4a2469de-c5ab-4a39-9168-01e03bd4b1c6\") " pod="openshift-dns/dns-default-6h5bj" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.764022 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.764059 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e3a102f6-2a75-4096-806a-7af5eca816e0-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rkxpp\" (UID: \"e3a102f6-2a75-4096-806a-7af5eca816e0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.765504 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/f8136f49-df98-42ad-98e7-93ddb91c2063-node-bootstrap-token\") pod \"machine-config-server-z2fjr\" (UID: \"f8136f49-df98-42ad-98e7-93ddb91c2063\") " pod="openshift-machine-config-operator/machine-config-server-z2fjr" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.770713 4698 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f5c8edb8-fc4d-440e-94a0-116059aed6ad-proxy-tls\") pod \"machine-config-controller-84d6567774-jzvrd\" (UID: \"f5c8edb8-fc4d-440e-94a0-116059aed6ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.770948 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/19c1f910-f805-4151-8eb8-7a6628a62b5b-mountpoint-dir\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.770959 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-auth-proxy-config\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.771109 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-default-certificate\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.771740 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c230438c-2633-4e31-b0da-b1d037e35e0c-tmpfs\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl" Feb 24 10:20:02 crc 
kubenswrapper[4698]: I0224 10:20:02.771981 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-client-ca\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.772640 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/916d037d-f52e-449e-8496-34695060f8d5-cert\") pod \"ingress-canary-6kd89\" (UID: \"916d037d-f52e-449e-8496-34695060f8d5\") " pod="openshift-ingress-canary/ingress-canary-6kd89" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.775795 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c230438c-2633-4e31-b0da-b1d037e35e0c-webhook-cert\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.777500 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bbfa8949-aef4-4d80-8ece-7af18d74a9a0-profile-collector-cert\") pod \"olm-operator-6b444d44fb-sn5g8\" (UID: \"bbfa8949-aef4-4d80-8ece-7af18d74a9a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.777876 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-proxy-tls\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.778568 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/29e7e4f9-c6e0-4a3a-8ec6-5c863c192667-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-fd5xc\" (UID: \"29e7e4f9-c6e0-4a3a-8ec6-5c863c192667\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.781828 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-metrics-certs\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.784964 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c230438c-2633-4e31-b0da-b1d037e35e0c-apiservice-cert\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.788807 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.805239 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.825183 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.831366 4698 projected.go:194] Error preparing data for projected volume 
kube-api-access-ptvs7 for pod openshift-machine-api/machine-api-operator-5694c8668f-vb7wk: failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.831457 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fccccb67-888f-4a34-a701-61926e9819a6-kube-api-access-ptvs7 podName:fccccb67-888f-4a34-a701-61926e9819a6 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.331435773 +0000 UTC m=+228.445050014 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ptvs7" (UniqueName: "kubernetes.io/projected/fccccb67-888f-4a34-a701-61926e9819a6-kube-api-access-ptvs7") pod "machine-api-operator-5694c8668f-vb7wk" (UID: "fccccb67-888f-4a34-a701-61926e9819a6") : failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.837443 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5ndrw"] Feb 24 10:20:02 crc kubenswrapper[4698]: W0224 10:20:02.840246 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1539b772_1d04_4bc9_85f1_99a99b5d237d.slice/crio-878f58d88f95e12203b17cbcac773847b9e0410fb6726b3792bd8743cdce2e05 WatchSource:0}: Error finding container 878f58d88f95e12203b17cbcac773847b9e0410fb6726b3792bd8743cdce2e05: Status 404 returned error can't find the container with id 878f58d88f95e12203b17cbcac773847b9e0410fb6726b3792bd8743cdce2e05 Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.844417 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.846490 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.846663 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.346644886 +0000 UTC m=+228.460259127 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.846758 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.847545 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.347522556 +0000 UTC m=+228.461136797 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.854027 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-serving-cert\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.864858 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.872939 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-service-ca-bundle\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.884541 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.890994 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-config\") pod \"authentication-operator-69f744f599-fq25r\" (UID: 
\"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.905276 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.924635 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.945067 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.949113 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.949227 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.449197609 +0000 UTC m=+228.562811850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.949499 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:02 crc kubenswrapper[4698]: E0224 10:20:02.949853 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.449842094 +0000 UTC m=+228.563456405 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.952910 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-config\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.964595 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 24 10:20:02 crc kubenswrapper[4698]: I0224 10:20:02.985301 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.009725 4698 configmap.go:193] Couldn't get configMap openshift-authentication-operator/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.009802 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-trusted-ca-bundle podName:59c9844c-00fe-42cd-add6-9ab528da273d nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.009783526 +0000 UTC m=+229.123397767 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-trusted-ca-bundle") pod "authentication-operator-69f744f599-fq25r" (UID: "59c9844c-00fe-42cd-add6-9ab528da273d") : failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.009828 4698 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.009887 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert podName:38bdf14d-35ac-440b-9a16-9a4ddd53df34 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.009870038 +0000 UTC m=+229.123484279 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert") pod "controller-manager-879f6c89f-w8bmq" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34") : failed to sync secret cache: timed out waiting for the condition Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.010083 4698 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.010117 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles podName:38bdf14d-35ac-440b-9a16-9a4ddd53df34 nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.010110124 +0000 UTC m=+229.123724365 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles") pod "controller-manager-879f6c89f-w8bmq" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34") : failed to sync configmap cache: timed out waiting for the condition Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.011672 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.031836 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.044795 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.050586 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.050738 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.550716068 +0000 UTC m=+228.664330309 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.050854 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.051198 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.551187778 +0000 UTC m=+228.664802019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.101088 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74k5q\" (UniqueName: \"kubernetes.io/projected/50f7a0ea-7b15-487b-b907-6fb4c7451eed-kube-api-access-74k5q\") pod \"openshift-config-operator-7777fb866f-lcjcd\" (UID: \"50f7a0ea-7b15-487b-b907-6fb4c7451eed\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.119471 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkh9l\" (UniqueName: \"kubernetes.io/projected/803e0d1c-f298-49b4-9251-9271f311ee92-kube-api-access-wkh9l\") pod \"oauth-openshift-558db77b4-hxxxs\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") " pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.138692 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbzx5\" (UniqueName: \"kubernetes.io/projected/effafc66-9dae-4ef3-86a5-72e1fac84fc4-kube-api-access-gbzx5\") pod \"openshift-apiserver-operator-796bbdcf4f-jrgwq\" (UID: \"effafc66-9dae-4ef3-86a5-72e1fac84fc4\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.154408 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.154569 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.65454131 +0000 UTC m=+228.768155551 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.154751 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb89h\" (UniqueName: \"kubernetes.io/projected/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-kube-api-access-kb89h\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.155082 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.156077 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.656063855 +0000 UTC m=+228.769678096 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.158351 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb89h\" (UniqueName: \"kubernetes.io/projected/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-kube-api-access-kb89h\") pod \"route-controller-manager-6576b87f9c-bfmpl\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.158701 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlwtm\" (UniqueName: \"kubernetes.io/projected/9c9967ed-20af-48cf-859d-4c3060d413fb-kube-api-access-nlwtm\") pod \"console-operator-58897d9998-qzmkf\" (UID: \"9c9967ed-20af-48cf-859d-4c3060d413fb\") " pod="openshift-console-operator/console-operator-58897d9998-qzmkf"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.182798 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wzkz\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-kube-api-access-5wzkz\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.183065 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.199044 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.205178 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kgzn\" (UniqueName: \"kubernetes.io/projected/312007fb-fd23-4a36-b653-ea3e24a02ee0-kube-api-access-9kgzn\") pod \"dns-operator-744455d44c-v6qbj\" (UID: \"312007fb-fd23-4a36-b653-ea3e24a02ee0\") " pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.210010 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.222837 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6b1ad070-5898-4b3a-ab57-57d781c9b809-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.236717 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-qzmkf"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.240175 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr8mq\" (UniqueName: \"kubernetes.io/projected/6b1ad070-5898-4b3a-ab57-57d781c9b809-kube-api-access-kr8mq\") pod \"cluster-image-registry-operator-dc59b4c8b-x8tds\" (UID: \"6b1ad070-5898-4b3a-ab57-57d781c9b809\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.257077 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.257345 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.757323487 +0000 UTC m=+228.870937748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.257421 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.257560 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk9sk\" (UniqueName: \"kubernetes.io/projected/59c9844c-00fe-42cd-add6-9ab528da273d-kube-api-access-zk9sk\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.257636 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h46gj\" (UniqueName: \"kubernetes.io/projected/38bdf14d-35ac-440b-9a16-9a4ddd53df34-kube-api-access-h46gj\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq"
Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.258068 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.758053265 +0000 UTC m=+228.871667506 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.261556 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.262535 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h46gj\" (UniqueName: \"kubernetes.io/projected/38bdf14d-35ac-440b-9a16-9a4ddd53df34-kube-api-access-h46gj\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.262539 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk9sk\" (UniqueName: \"kubernetes.io/projected/59c9844c-00fe-42cd-add6-9ab528da273d-kube-api-access-zk9sk\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.265359 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.268341 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qxvw\" (UniqueName: \"kubernetes.io/projected/e6dc6e77-8617-4bc0-8960-6b81b87c8b88-kube-api-access-6qxvw\") pod \"openshift-controller-manager-operator-756b6f6bc6-snxxh\" (UID: \"e6dc6e77-8617-4bc0-8960-6b81b87c8b88\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.272523 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.287881 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-bound-sa-token\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.328572 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc8z5\" (UniqueName: \"kubernetes.io/projected/348d0b48-f2a9-4326-b8c8-88f43029f382-kube-api-access-jc8z5\") pod \"console-f9d7485db-clwmh\" (UID: \"348d0b48-f2a9-4326-b8c8-88f43029f382\") " pod="openshift-console/console-f9d7485db-clwmh"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.334458 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk7pb\" (UniqueName: \"kubernetes.io/projected/568f96c6-6a68-4e06-a1e1-1b787f58bac7-kube-api-access-hk7pb\") pod \"apiserver-76f77b778f-lqzfp\" (UID: \"568f96c6-6a68-4e06-a1e1-1b787f58bac7\") " pod="openshift-apiserver/apiserver-76f77b778f-lqzfp"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.344791 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/bbfa8949-aef4-4d80-8ece-7af18d74a9a0-kube-api-access-6xxpc\") pod \"olm-operator-6b444d44fb-sn5g8\" (UID: \"bbfa8949-aef4-4d80-8ece-7af18d74a9a0\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.358628 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.359050 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptvs7\" (UniqueName: \"kubernetes.io/projected/fccccb67-888f-4a34-a701-61926e9819a6-kube-api-access-ptvs7\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk"
Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.359521 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.859491051 +0000 UTC m=+228.973105302 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.361791 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdgfd\" (UniqueName: \"kubernetes.io/projected/108d72f5-0dd9-4965-a41f-7403ad8fce04-kube-api-access-mdgfd\") pod \"downloads-7954f5f757-z42jf\" (UID: \"108d72f5-0dd9-4965-a41f-7403ad8fce04\") " pod="openshift-console/downloads-7954f5f757-z42jf"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.364304 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptvs7\" (UniqueName: \"kubernetes.io/projected/fccccb67-888f-4a34-a701-61926e9819a6-kube-api-access-ptvs7\") pod \"machine-api-operator-5694c8668f-vb7wk\" (UID: \"fccccb67-888f-4a34-a701-61926e9819a6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.387638 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgw9l\" (UniqueName: \"kubernetes.io/projected/c45160a9-f0cb-4b39-ad29-67c14871973f-kube-api-access-wgw9l\") pod \"service-ca-9c57cc56f-f69tr\" (UID: \"c45160a9-f0cb-4b39-ad29-67c14871973f\") " pod="openshift-service-ca/service-ca-9c57cc56f-f69tr"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.410658 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.413868 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-97zrr\" (UID: \"ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.418005 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvmx4\" (UniqueName: \"kubernetes.io/projected/d0de08e0-63c0-4a90-a264-1bc41b8746d8-kube-api-access-kvmx4\") pod \"marketplace-operator-79b997595-79f62\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") " pod="openshift-marketplace/marketplace-operator-79b997595-79f62"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.426707 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.443120 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.447035 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4rfn\" (UniqueName: \"kubernetes.io/projected/e3a102f6-2a75-4096-806a-7af5eca816e0-kube-api-access-b4rfn\") pod \"package-server-manager-789f6589d5-rkxpp\" (UID: \"e3a102f6-2a75-4096-806a-7af5eca816e0\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.457748 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.462948 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.463372 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:03.963359164 +0000 UTC m=+229.076973405 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.466674 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pk54\" (UniqueName: \"kubernetes.io/projected/f8136f49-df98-42ad-98e7-93ddb91c2063-kube-api-access-9pk54\") pod \"machine-config-server-z2fjr\" (UID: \"f8136f49-df98-42ad-98e7-93ddb91c2063\") " pod="openshift-machine-config-operator/machine-config-server-z2fjr"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.489090 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/04421fdc-439e-4b78-b6ce-fcf8957ddf92-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-shwvb\" (UID: \"04421fdc-439e-4b78-b6ce-fcf8957ddf92\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.504072 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjxkt\" (UniqueName: \"kubernetes.io/projected/0afb3450-a53f-4b58-9472-7bbec9f4eb54-kube-api-access-pjxkt\") pod \"migrator-59844c95c7-skflh\" (UID: \"0afb3450-a53f-4b58-9472-7bbec9f4eb54\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.519092 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lqzfp"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.526619 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd"]
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.527213 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/133fe0e9-d231-45aa-bea8-6add237cffb4-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-265tl\" (UID: \"133fe0e9-d231-45aa-bea8-6add237cffb4\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.531839 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-clwmh"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.536191 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-v6qbj"]
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.543953 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7922021-adba-44fd-aff2-2f0776f3fabe-bound-sa-token\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.552122 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-z2fjr"
Feb 24 10:20:03 crc kubenswrapper[4698]: W0224 10:20:03.558104 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50f7a0ea_7b15_487b_b907_6fb4c7451eed.slice/crio-b5a8f632ae4fa745145aa5f2a7761c90f140137bbba43beabb26d170a2c65c54 WatchSource:0}: Error finding container b5a8f632ae4fa745145aa5f2a7761c90f140137bbba43beabb26d170a2c65c54: Status 404 returned error can't find the container with id b5a8f632ae4fa745145aa5f2a7761c90f140137bbba43beabb26d170a2c65c54
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.565496 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.566307 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.066288186 +0000 UTC m=+229.179902417 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.566344 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bmbt\" (UniqueName: \"kubernetes.io/projected/b07a9333-815e-464f-afc6-28c1da857d84-kube-api-access-6bmbt\") pod \"cluster-samples-operator-665b6dd947-xcfhs\" (UID: \"b07a9333-815e-464f-afc6-28c1da857d84\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.580026 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2pb2\" (UniqueName: \"kubernetes.io/projected/c230438c-2633-4e31-b0da-b1d037e35e0c-kube-api-access-l2pb2\") pod \"packageserver-d55dfcdfc-5tdjl\" (UID: \"c230438c-2633-4e31-b0da-b1d037e35e0c\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.585618 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-z42jf"
Feb 24 10:20:03 crc kubenswrapper[4698]: W0224 10:20:03.587637 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod312007fb_fd23_4a36_b653_ea3e24a02ee0.slice/crio-1eb9786f18ecde1cf9b2f50ee6fbdef60282a672abb2e431b6d0e446758bdeac WatchSource:0}: Error finding container 1eb9786f18ecde1cf9b2f50ee6fbdef60282a672abb2e431b6d0e446758bdeac: Status 404 returned error can't find the container with id 1eb9786f18ecde1cf9b2f50ee6fbdef60282a672abb2e431b6d0e446758bdeac
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.592682 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.610008 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl7fn\" (UniqueName: \"kubernetes.io/projected/916d037d-f52e-449e-8496-34695060f8d5-kube-api-access-sl7fn\") pod \"ingress-canary-6kd89\" (UID: \"916d037d-f52e-449e-8496-34695060f8d5\") " pod="openshift-ingress-canary/ingress-canary-6kd89"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.613160 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.622735 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.622895 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggrxx\" (UniqueName: \"kubernetes.io/projected/aed98b22-f91a-4aba-ab64-65fc09af1478-kube-api-access-ggrxx\") pod \"service-ca-operator-777779d784-csv7z\" (UID: \"aed98b22-f91a-4aba-ab64-65fc09af1478\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.640228 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.647374 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.660287 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"]
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.662782 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhwhr\" (UniqueName: \"kubernetes.io/projected/29e7e4f9-c6e0-4a3a-8ec6-5c863c192667-kube-api-access-xhwhr\") pod \"multus-admission-controller-857f4d67dd-fd5xc\" (UID: \"29e7e4f9-c6e0-4a3a-8ec6-5c863c192667\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.663520 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jp2q\" (UniqueName: \"kubernetes.io/projected/4a2469de-c5ab-4a39-9168-01e03bd4b1c6-kube-api-access-4jp2q\") pod \"dns-default-6h5bj\" (UID: \"4a2469de-c5ab-4a39-9168-01e03bd4b1c6\") " pod="openshift-dns/dns-default-6h5bj"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.667160 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.668301 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.168288346 +0000 UTC m=+229.281902587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.668716 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-f69tr"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.690819 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-79f62"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.692865 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq"]
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.698387 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.699313 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8"]
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.704536 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl72l\" (UniqueName: \"kubernetes.io/projected/f5c8edb8-fc4d-440e-94a0-116059aed6ad-kube-api-access-vl72l\") pod \"machine-config-controller-84d6567774-jzvrd\" (UID: \"f5c8edb8-fc4d-440e-94a0-116059aed6ad\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.705669 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jl59\" (UniqueName: \"kubernetes.io/projected/b681f586-b4e0-4b2a-ab97-ea20583eeb34-kube-api-access-7jl59\") pod \"catalog-operator-68c6474976-bz5kc\" (UID: \"b681f586-b4e0-4b2a-ab97-ea20583eeb34\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.719199 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.725254 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmnnk\" (UniqueName: \"kubernetes.io/projected/b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658-kube-api-access-mmnnk\") pod \"router-default-5444994796-trk7h\" (UID: \"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658\") " pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.748027 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdt2j\" (UniqueName: \"kubernetes.io/projected/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-kube-api-access-wdt2j\") pod \"collect-profiles-29532135-2wtjd\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.762833 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rbbl\" (UniqueName: \"kubernetes.io/projected/50b7cda3-dd1c-4644-b5a6-23957a406b19-kube-api-access-4rbbl\") pod \"kube-storage-version-migrator-operator-b67b599dd-z54pv\" (UID: \"50b7cda3-dd1c-4644-b5a6-23957a406b19\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.772182 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.777638 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.778079 4698 request.go:700] Waited for 1.006695439s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/serviceaccounts/machine-config-operator/token
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.778625 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.778776 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.278758283 +0000 UTC m=+229.392372524 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.779098 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.779482 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.279471139 +0000 UTC m=+229.393085380 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.788387 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6h5bj"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.793591 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6kd89"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.794499 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzjzw\" (UniqueName: \"kubernetes.io/projected/f688bfd3-2a09-4640-8ec8-9b69cc9881c4-kube-api-access-hzjzw\") pod \"machine-config-operator-74547568cd-w5v5m\" (UID: \"f688bfd3-2a09-4640-8ec8-9b69cc9881c4\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.838924 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9kdb\" (UniqueName: \"kubernetes.io/projected/19c1f910-f805-4151-8eb8-7a6628a62b5b-kube-api-access-w9kdb\") pod \"csi-hostpathplugin-fbt6k\" (UID: \"19c1f910-f805-4151-8eb8-7a6628a62b5b\") " pod="hostpath-provisioner/csi-hostpathplugin-fbt6k"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.844375 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x7rs\" (UniqueName: \"kubernetes.io/projected/e7922021-adba-44fd-aff2-2f0776f3fabe-kube-api-access-6x7rs\") pod \"ingress-operator-5b745b69d9-gnxh7\" (UID: \"e7922021-adba-44fd-aff2-2f0776f3fabe\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.844589 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds"]
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.845942 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" event={"ID":"a09fb76f-4291-4945-8bb0-15c478a35cbf","Type":"ContainerStarted","Data":"b9c5ab493ca552282dfe0fde171efd5035a595db6f6d7fd4cef36aa9bed44cef"}
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.845984 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" event={"ID":"a09fb76f-4291-4945-8bb0-15c478a35cbf","Type":"ContainerStarted","Data":"60f8be5ce68ac4309d562d862e98a51c579a50311d8e6f3f25ced1e8450c13cc"}
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.846106 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh"]
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.847157 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" event={"ID":"50f7a0ea-7b15-487b-b907-6fb4c7451eed","Type":"ContainerStarted","Data":"b5a8f632ae4fa745145aa5f2a7761c90f140137bbba43beabb26d170a2c65c54"}
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.847242 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-qzmkf"]
Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.847514 4698 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.856310 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw" event={"ID":"1539b772-1d04-4bc9-85f1-99a99b5d237d","Type":"ContainerStarted","Data":"aeb46aefade81bb95293245015a37a20b2becbbaeae5f3fdf533d08e638ed436"} Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.856519 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw" event={"ID":"1539b772-1d04-4bc9-85f1-99a99b5d237d","Type":"ContainerStarted","Data":"878f58d88f95e12203b17cbcac773847b9e0410fb6726b3792bd8743cdce2e05"} Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.859402 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46p6r\" (UniqueName: \"kubernetes.io/projected/875da7ec-7eeb-4f5c-b849-73863732ebb2-kube-api-access-46p6r\") pod \"control-plane-machine-set-operator-78cbb6b69f-5582h\" (UID: \"875da7ec-7eeb-4f5c-b849-73863732ebb2\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.859647 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-z2fjr" event={"ID":"f8136f49-df98-42ad-98e7-93ddb91c2063","Type":"ContainerStarted","Data":"10a8aa8e46622a3e4be9ad0174c89cea0bf5c32561a659c14d13e76b82dc5efd"} Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.862109 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj" event={"ID":"312007fb-fd23-4a36-b653-ea3e24a02ee0","Type":"ContainerStarted","Data":"1eb9786f18ecde1cf9b2f50ee6fbdef60282a672abb2e431b6d0e446758bdeac"} Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.880353 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.880551 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.380527646 +0000 UTC m=+229.494141877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.891117 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hxxxs"] Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.907196 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-trk7h" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.915036 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.961493 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.978152 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc" Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.979140 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr"] Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.986437 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:03 crc kubenswrapper[4698]: E0224 10:20:03.988351 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.488337361 +0000 UTC m=+229.601951712 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:03 crc kubenswrapper[4698]: I0224 10:20:03.990049 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd" Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.032535 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m" Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.049660 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp"] Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.051351 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h" Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.089107 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.089367 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.089411 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.089495 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:04 crc kubenswrapper[4698]: E0224 10:20:04.092489 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.592463861 +0000 UTC m=+229.706078102 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.093572 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.093946 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/59c9844c-00fe-42cd-add6-9ab528da273d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-fq25r\" (UID: \"59c9844c-00fe-42cd-add6-9ab528da273d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.097778 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert\") pod \"controller-manager-879f6c89f-w8bmq\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.191208 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:04 crc kubenswrapper[4698]: E0224 10:20:04.191848 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.69183312 +0000 UTC m=+229.805447371 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.216844 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-clwmh"] Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.230827 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs"] Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.232207 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-z42jf"] Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.251218 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.281451 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.292541 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:04 crc kubenswrapper[4698]: E0224 10:20:04.292671 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.792644932 +0000 UTC m=+229.906259173 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.292728 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:04 crc kubenswrapper[4698]: E0224 10:20:04.293048 4698 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.793036611 +0000 UTC m=+229.906650852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.300422 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lqzfp"] Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.386445 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"] Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.393368 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:04 crc kubenswrapper[4698]: E0224 10:20:04.393536 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.893518905 +0000 UTC m=+230.007133146 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.393588 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:04 crc kubenswrapper[4698]: E0224 10:20:04.393863 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.893855383 +0000 UTC m=+230.007469624 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.408078 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl"] Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.410180 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-vb7wk"] Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.494451 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:04 crc kubenswrapper[4698]: E0224 10:20:04.494794 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:04.994779888 +0000 UTC m=+230.108394129 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:04 crc kubenswrapper[4698]: W0224 10:20:04.517051 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc230438c_2633_4e31_b0da_b1d037e35e0c.slice/crio-6932037671ecb6ad40ecf9393d57d907a1d5ba17c6b7f9b7281d6e3b0a3092e3 WatchSource:0}: Error finding container 6932037671ecb6ad40ecf9393d57d907a1d5ba17c6b7f9b7281d6e3b0a3092e3: Status 404 returned error can't find the container with id 6932037671ecb6ad40ecf9393d57d907a1d5ba17c6b7f9b7281d6e3b0a3092e3 Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.595935 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:04 crc kubenswrapper[4698]: E0224 10:20:04.596824 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.096808768 +0000 UTC m=+230.210423009 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.697552 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:04 crc kubenswrapper[4698]: E0224 10:20:04.698039 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.19802346 +0000 UTC m=+230.311637701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.798549 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:04 crc kubenswrapper[4698]: E0224 10:20:04.798866 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.298847182 +0000 UTC m=+230.412461423 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.903897 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:04 crc kubenswrapper[4698]: E0224 10:20:04.905285 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.405232454 +0000 UTC m=+230.518846705 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.933458 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qzmkf" event={"ID":"9c9967ed-20af-48cf-859d-4c3060d413fb","Type":"ContainerStarted","Data":"e523e1e3f47b43814149e588056980bba97ea6576eee2955745e6c384e705615"} Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.941872 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq" event={"ID":"effafc66-9dae-4ef3-86a5-72e1fac84fc4","Type":"ContainerStarted","Data":"b3eac4269bddbead68b906ab532fcd9dd894e99ed74af17b7e989a5097142aa0"} Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.949346 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" event={"ID":"568f96c6-6a68-4e06-a1e1-1b787f58bac7","Type":"ContainerStarted","Data":"32d9053cd733a7e04acbb5b6a8a34ab558589517e2d8ea7e7c97473d99bd13e9"} Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.951013 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr" event={"ID":"ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300","Type":"ContainerStarted","Data":"68bcf5f222b5b16e285a98f2d53c5aa3ebfd3f37f7f4c287bba6327ff4b7474c"} Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.955765 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" event={"ID":"6b1ad070-5898-4b3a-ab57-57d781c9b809","Type":"ContainerStarted","Data":"4f4c8cfc30b735301ba3b449a3dca6eeb9b50a32a2d28c3fd9a47c30a087503a"} Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.964587 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl" event={"ID":"c230438c-2633-4e31-b0da-b1d037e35e0c","Type":"ContainerStarted","Data":"6932037671ecb6ad40ecf9393d57d907a1d5ba17c6b7f9b7281d6e3b0a3092e3"} Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.975911 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh" event={"ID":"e6dc6e77-8617-4bc0-8960-6b81b87c8b88","Type":"ContainerStarted","Data":"f274ff905f8313a184b5bb3ad87c308fe54b1e2b9fcb84936c02658e6e307de1"} Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.981759 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-clwmh" event={"ID":"348d0b48-f2a9-4326-b8c8-88f43029f382","Type":"ContainerStarted","Data":"dacc30da31312034973e2bb83ae9ea3609e119adb05b43791a1a36f3194e31cc"} Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.982743 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" event={"ID":"803e0d1c-f298-49b4-9251-9271f311ee92","Type":"ContainerStarted","Data":"d71a1b93b994f47cdc66c06b07c608956dbff0035099963d465fb1cc3d85468e"} Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.988585 4698 generic.go:334] "Generic (PLEG): container finished" podID="50f7a0ea-7b15-487b-b907-6fb4c7451eed" containerID="63e8549f18b9a4391682100ffd85c1ca8684faee3da00c8766ee117537a905f7" exitCode=0 Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.989495 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" event={"ID":"50f7a0ea-7b15-487b-b907-6fb4c7451eed","Type":"ContainerDied","Data":"63e8549f18b9a4391682100ffd85c1ca8684faee3da00c8766ee117537a905f7"} Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.993682 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" event={"ID":"bbfa8949-aef4-4d80-8ece-7af18d74a9a0","Type":"ContainerStarted","Data":"d40f25c6f2d677e4e6ded777566a03f122bb4b37833a54bb7c6d2d8413d7d4ba"} Feb 24 10:20:04 crc kubenswrapper[4698]: I0224 10:20:04.995635 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp" event={"ID":"e3a102f6-2a75-4096-806a-7af5eca816e0","Type":"ContainerStarted","Data":"d44c5c61f20e64e191e8b3033d5e24e91c07c0bab364fc085c4a4de5cf01c22c"} Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.001618 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl" event={"ID":"133fe0e9-d231-45aa-bea8-6add237cffb4","Type":"ContainerStarted","Data":"d04da2ae423517113fd45efa16e1149f5c84c70752de34420f8236e3a71b83b5"} Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.003582 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" event={"ID":"fccccb67-888f-4a34-a701-61926e9819a6","Type":"ContainerStarted","Data":"d22f4702588ff1a533ac6df5e4424c28f37d0da085b9eb6d5f2e3f5994c2a9cc"} Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.006641 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.006913 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.506902806 +0000 UTC m=+230.620517047 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.018871 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" event={"ID":"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f","Type":"ContainerStarted","Data":"a8452b76f874839619a1fc704f7226d4521298df8401c72b5f21234bde282250"} Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.020391 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z42jf" event={"ID":"108d72f5-0dd9-4965-a41f-7403ad8fce04","Type":"ContainerStarted","Data":"23fc4e1ffc52578997683ea92679ee63de804c40783221fde83d589da882a9e6"} Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.026448 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj" event={"ID":"312007fb-fd23-4a36-b653-ea3e24a02ee0","Type":"ContainerStarted","Data":"d99ec22ba0203f5364f7af35f9bdabbc077bdf1109606c0ed533162ae306ad71"} Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.028609 4698 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ingress/router-default-5444994796-trk7h" event={"ID":"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658","Type":"ContainerStarted","Data":"830bd51aa577a4c2f44878bfe0a58de4d1887e10d7f71fc70c4a57cce7a813a8"} Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.030382 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-z2fjr" event={"ID":"f8136f49-df98-42ad-98e7-93ddb91c2063","Type":"ContainerStarted","Data":"da1a4894a9ce01483d3e0fd57d26dbcf9f9807e09830c6d96bb81f657541fcdc"} Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.100323 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" podStartSLOduration=186.100309256 podStartE2EDuration="3m6.100309256s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:05.09746852 +0000 UTC m=+230.211082761" watchObservedRunningTime="2026-02-24 10:20:05.100309256 +0000 UTC m=+230.213923497" Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.110653 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.110787 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.610758209 +0000 UTC m=+230.724372450 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.116449 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.117799 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.617789432 +0000 UTC m=+230.731403673 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.192816 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6kd89"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.194427 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6h5bj"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.206573 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv"] Feb 24 10:20:05 crc kubenswrapper[4698]: W0224 10:20:05.207092 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod916d037d_f52e_449e_8496_34695060f8d5.slice/crio-99275a7139413da86b6d0eeac8f557f0ef6d7d850659b7d5e66098760c6807c0 WatchSource:0}: Error finding container 99275a7139413da86b6d0eeac8f557f0ef6d7d850659b7d5e66098760c6807c0: Status 404 returned error can't find the container with id 99275a7139413da86b6d0eeac8f557f0ef6d7d850659b7d5e66098760c6807c0 Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.218461 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 
10:20:05.218856 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.71882879 +0000 UTC m=+230.832443041 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.219106 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.219608 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.719597858 +0000 UTC m=+230.833212099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: W0224 10:20:05.228977 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a2469de_c5ab_4a39_9168_01e03bd4b1c6.slice/crio-82cc54b1793de5d01e41c3f90948560fb20a2df6c61fc0ee4fd2d2c6a7227db4 WatchSource:0}: Error finding container 82cc54b1793de5d01e41c3f90948560fb20a2df6c61fc0ee4fd2d2c6a7227db4: Status 404 returned error can't find the container with id 82cc54b1793de5d01e41c3f90948560fb20a2df6c61fc0ee4fd2d2c6a7227db4 Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.243275 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.249185 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-csv7z"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.260674 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.264415 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-w8bmq"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.280247 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 
10:20:05.280885 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-79f62"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.287237 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.293583 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.298363 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fbt6k"] Feb 24 10:20:05 crc kubenswrapper[4698]: W0224 10:20:05.303601 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaed98b22_f91a_4aba_ab64_65fc09af1478.slice/crio-9aa4dcdbd7bd3b63b51ce74e2c81b3c0c6e328e464e8ad0ae844d7c85842fefd WatchSource:0}: Error finding container 9aa4dcdbd7bd3b63b51ce74e2c81b3c0c6e328e464e8ad0ae844d7c85842fefd: Status 404 returned error can't find the container with id 9aa4dcdbd7bd3b63b51ce74e2c81b3c0c6e328e464e8ad0ae844d7c85842fefd Feb 24 10:20:05 crc kubenswrapper[4698]: W0224 10:20:05.303920 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod875da7ec_7eeb_4f5c_b849_73863732ebb2.slice/crio-a5702d09576790c3d4fe35efc17990611fd8ee2b31cce2854bfbced786c13366 WatchSource:0}: Error finding container a5702d09576790c3d4fe35efc17990611fd8ee2b31cce2854bfbced786c13366: Status 404 returned error can't find the container with id a5702d09576790c3d4fe35efc17990611fd8ee2b31cce2854bfbced786c13366 Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.308309 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc"] 
Feb 24 10:20:05 crc kubenswrapper[4698]: W0224 10:20:05.311987 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38bdf14d_35ac_440b_9a16_9a4ddd53df34.slice/crio-7f6aa9a1f37035e1ace1fe1d70b4313c6f79e1d8550e2f9e8fa80e3d07fd539b WatchSource:0}: Error finding container 7f6aa9a1f37035e1ace1fe1d70b4313c6f79e1d8550e2f9e8fa80e3d07fd539b: Status 404 returned error can't find the container with id 7f6aa9a1f37035e1ace1fe1d70b4313c6f79e1d8550e2f9e8fa80e3d07fd539b Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.312520 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-f69tr"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.318922 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-fq25r"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.320116 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.320560 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.820539763 +0000 UTC m=+230.934154004 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.321448 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-fd5xc"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.323574 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.323913 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.823901041 +0000 UTC m=+230.937515282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.324300 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd"] Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.326438 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd"] Feb 24 10:20:05 crc kubenswrapper[4698]: W0224 10:20:05.328447 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0de08e0_63c0_4a90_a264_1bc41b8746d8.slice/crio-cbd4ce812626fa202a05f89f1758daf6ee0a36e4e8b52e7242c48fc61220faa8 WatchSource:0}: Error finding container cbd4ce812626fa202a05f89f1758daf6ee0a36e4e8b52e7242c48fc61220faa8: Status 404 returned error can't find the container with id cbd4ce812626fa202a05f89f1758daf6ee0a36e4e8b52e7242c48fc61220faa8 Feb 24 10:20:05 crc kubenswrapper[4698]: W0224 10:20:05.333494 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19c1f910_f805_4151_8eb8_7a6628a62b5b.slice/crio-26199c8020a5c267a623a2e0b6ce2856a1a85641ab0a00360570afcc131dcc22 WatchSource:0}: Error finding container 26199c8020a5c267a623a2e0b6ce2856a1a85641ab0a00360570afcc131dcc22: Status 404 returned error can't find the container with id 26199c8020a5c267a623a2e0b6ce2856a1a85641ab0a00360570afcc131dcc22 Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.424890 4698 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.425056 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.92503395 +0000 UTC m=+231.038648191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.425412 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.425824 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:05.925809589 +0000 UTC m=+231.039423830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.526996 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.527131 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.027103052 +0000 UTC m=+231.140717303 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.527180 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.527726 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.027716216 +0000 UTC m=+231.141330467 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.539068 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-6bh9j" podStartSLOduration=187.53904823 podStartE2EDuration="3m7.53904823s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:05.537889603 +0000 UTC m=+230.651503844" watchObservedRunningTime="2026-02-24 10:20:05.53904823 +0000 UTC m=+230.652662471" Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.620974 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-5ndrw" podStartSLOduration=186.620960642 podStartE2EDuration="3m6.620960642s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:05.620503452 +0000 UTC m=+230.734117693" watchObservedRunningTime="2026-02-24 10:20:05.620960642 +0000 UTC m=+230.734574883" Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.628091 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.628229 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.128212071 +0000 UTC m=+231.241826312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.628546 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.628827 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.128820085 +0000 UTC m=+231.242434326 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.675102 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55110: no serving certificate available for the kubelet" Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.693747 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-z2fjr" podStartSLOduration=5.693729443 podStartE2EDuration="5.693729443s" podCreationTimestamp="2026-02-24 10:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:05.693079928 +0000 UTC m=+230.806694189" watchObservedRunningTime="2026-02-24 10:20:05.693729443 +0000 UTC m=+230.807343684" Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.709885 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55120: no serving certificate available for the kubelet" Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.729388 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.729725 4698 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.229699489 +0000 UTC m=+231.343313730 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.793460 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.793750 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.800462 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.809866 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55126: no serving certificate available for the kubelet" Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.830565 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 
10:20:05.830849 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.330838218 +0000 UTC m=+231.444452459 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.910931 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55136: no serving certificate available for the kubelet" Feb 24 10:20:05 crc kubenswrapper[4698]: I0224 10:20:05.931654 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:05 crc kubenswrapper[4698]: E0224 10:20:05.932570 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.432552242 +0000 UTC m=+231.546166483 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.011142 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55140: no serving certificate available for the kubelet" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.033686 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.034378 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.534349857 +0000 UTC m=+231.647964138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.038528 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc" event={"ID":"b681f586-b4e0-4b2a-ab97-ea20583eeb34","Type":"ContainerStarted","Data":"6795b36a1b482880a4178f9d3138700bf6331fcfa98bcd956ac0116392c9b094"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.040078 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq" event={"ID":"effafc66-9dae-4ef3-86a5-72e1fac84fc4","Type":"ContainerStarted","Data":"213f68bb2bca031f920248df189b681ab7392a54643c7542936720e3b66eab10"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.040782 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h" event={"ID":"875da7ec-7eeb-4f5c-b849-73863732ebb2","Type":"ContainerStarted","Data":"a5702d09576790c3d4fe35efc17990611fd8ee2b31cce2854bfbced786c13366"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.045246 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb" event={"ID":"04421fdc-439e-4b78-b6ce-fcf8957ddf92","Type":"ContainerStarted","Data":"0c26eefa9e4a75eeeeb6ef8328a5f8480b57ef345a443cd53ab07551e10cf980"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.046545 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp" event={"ID":"e3a102f6-2a75-4096-806a-7af5eca816e0","Type":"ContainerStarted","Data":"9bb0ffd99c065f0f325b05aea50d9f7418131922d7f2496286f704627fff82e4"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.047207 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" event={"ID":"38bdf14d-35ac-440b-9a16-9a4ddd53df34","Type":"ContainerStarted","Data":"7f6aa9a1f37035e1ace1fe1d70b4313c6f79e1d8550e2f9e8fa80e3d07fd539b"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.048187 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-qzmkf" event={"ID":"9c9967ed-20af-48cf-859d-4c3060d413fb","Type":"ContainerStarted","Data":"b5eac6d3f1c597b04f29924c81d7f6b8301313569d2f184c2b32ada05ae9b493"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.049209 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-qzmkf" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.050133 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd" event={"ID":"f5c8edb8-fc4d-440e-94a0-116059aed6ad","Type":"ContainerStarted","Data":"5d09f8090239bce216d9d6baeadf3533ffd5833050695a79f796dbd7fa6d4ab3"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.050485 4698 patch_prober.go:28] interesting pod/console-operator-58897d9998-qzmkf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.050548 4698 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console-operator/console-operator-58897d9998-qzmkf" podUID="9c9967ed-20af-48cf-859d-4c3060d413fb" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.052952 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" event={"ID":"fccccb67-888f-4a34-a701-61926e9819a6","Type":"ContainerStarted","Data":"fe388f12ba12a21e812e17b2390f69b00c723d2f9a8cbfbcba7fbd263fcb7349"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.058629 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl" event={"ID":"c230438c-2633-4e31-b0da-b1d037e35e0c","Type":"ContainerStarted","Data":"17b232dfdede9fe070a476ba7baf0ffaf5640795638c022ff271ec6f5f879f71"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.059359 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.062669 4698 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5tdjl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.062733 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl" podUID="c230438c-2633-4e31-b0da-b1d037e35e0c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.063929 4698 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7" event={"ID":"e7922021-adba-44fd-aff2-2f0776f3fabe","Type":"ContainerStarted","Data":"8816c1d94abee24b33c9f4cf848b4ef309ff46a8be3dc4e88837138635de80ec"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.075495 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-clwmh" event={"ID":"348d0b48-f2a9-4326-b8c8-88f43029f382","Type":"ContainerStarted","Data":"4567d880273b6a62d662451d231a49bf62ac1c58e64b8126c1c9d7fdd99f95f8"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.080977 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-f69tr" event={"ID":"c45160a9-f0cb-4b39-ad29-67c14871973f","Type":"ContainerStarted","Data":"b1bf8aab005c18704e897ce7680134aefa0459b7c724ba8a69d1d33e1b8be77c"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.083197 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd" event={"ID":"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9","Type":"ContainerStarted","Data":"569095d4310b2fb137e5e2990855206a091ce46a80655a67c8377af3042a0199"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.084577 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh" event={"ID":"e6dc6e77-8617-4bc0-8960-6b81b87c8b88","Type":"ContainerStarted","Data":"33c97ed26b7887929af068e6bc963b73e8b50387aecb273721885ba3f64fabd6"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.089385 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" event={"ID":"19c1f910-f805-4151-8eb8-7a6628a62b5b","Type":"ContainerStarted","Data":"26199c8020a5c267a623a2e0b6ce2856a1a85641ab0a00360570afcc131dcc22"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.119565 4698 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-trk7h" event={"ID":"b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658","Type":"ContainerStarted","Data":"5ccd50ac2f4821dd22c9e23dccd92495cb5eeb92bed17296942731bde67030c9"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.131892 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs" event={"ID":"b07a9333-815e-464f-afc6-28c1da857d84","Type":"ContainerStarted","Data":"210fcec8d92c259340af7d13ac2940ba0a679464d3edad496173d3bc3d75a01b"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.135497 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.136988 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.636968751 +0000 UTC m=+231.750582992 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.150483 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv" event={"ID":"50b7cda3-dd1c-4644-b5a6-23957a406b19","Type":"ContainerStarted","Data":"0d84bdbd168e1d92b84d86d921dcef40410679a94df56fcbc81c44fcf6b1774a"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.151770 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" event={"ID":"bbfa8949-aef4-4d80-8ece-7af18d74a9a0","Type":"ContainerStarted","Data":"2a98fcacd941b5b5e8f5cd65561fe991c9d94460a0d94a5820c8337c24273c1e"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.154348 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.158190 4698 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-sn5g8 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.158431 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" podUID="bbfa8949-aef4-4d80-8ece-7af18d74a9a0" 
containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.181612 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55146: no serving certificate available for the kubelet" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.205791 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z42jf" event={"ID":"108d72f5-0dd9-4965-a41f-7403ad8fce04","Type":"ContainerStarted","Data":"34254fe153c37c2de36dbd7cc42747f42149dacd16b6044db1672740dbeaf1a9"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.206747 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-z42jf" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.221089 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.221143 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.237253 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 
10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.239200 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.739172736 +0000 UTC m=+231.852786977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.263044 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m" event={"ID":"f688bfd3-2a09-4640-8ec8-9b69cc9881c4","Type":"ContainerStarted","Data":"fef43ac499e64c7f451a65308e7aae10a6a06efa1bb633f02dadfe99c01d0575"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.277726 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr" event={"ID":"ae1f2a5a-7833-46ff-b7e0-d8c2b70ec300","Type":"ContainerStarted","Data":"b3fa42ae734521152f9dc4d31a8127c5d6184ececa69ae22bbeedbed67e2ce12"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.279712 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6kd89" event={"ID":"916d037d-f52e-449e-8496-34695060f8d5","Type":"ContainerStarted","Data":"99275a7139413da86b6d0eeac8f557f0ef6d7d850659b7d5e66098760c6807c0"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.294740 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-79f62" event={"ID":"d0de08e0-63c0-4a90-a264-1bc41b8746d8","Type":"ContainerStarted","Data":"cbd4ce812626fa202a05f89f1758daf6ee0a36e4e8b52e7242c48fc61220faa8"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.297643 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z" event={"ID":"aed98b22-f91a-4aba-ab64-65fc09af1478","Type":"ContainerStarted","Data":"9aa4dcdbd7bd3b63b51ce74e2c81b3c0c6e328e464e8ad0ae844d7c85842fefd"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.299028 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" event={"ID":"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f","Type":"ContainerStarted","Data":"42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.299336 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.300816 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" event={"ID":"59c9844c-00fe-42cd-add6-9ab528da273d","Type":"ContainerStarted","Data":"ef093e9580544d29c68c54584fcf092ea852e0eede34926a3ae54d9f84fb1718"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.301430 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh" event={"ID":"0afb3450-a53f-4b58-9472-7bbec9f4eb54","Type":"ContainerStarted","Data":"47f96b82beae0553fb4dec4b4ce14f8d9afb899730a25f17672294e87eb5b1af"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.301920 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6h5bj" 
event={"ID":"4a2469de-c5ab-4a39-9168-01e03bd4b1c6","Type":"ContainerStarted","Data":"82cc54b1793de5d01e41c3f90948560fb20a2df6c61fc0ee4fd2d2c6a7227db4"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.303389 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" event={"ID":"803e0d1c-f298-49b4-9251-9271f311ee92","Type":"ContainerStarted","Data":"efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.303884 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.306349 4698 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hxxxs container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body= Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.306601 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" podUID="803e0d1c-f298-49b4-9251-9271f311ee92" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.306928 4698 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bfmpl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.306957 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" 
podUID="7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.311245 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc" event={"ID":"29e7e4f9-c6e0-4a3a-8ec6-5c863c192667","Type":"ContainerStarted","Data":"085d09e9024eb512beadfca937c9cd85e7bb55e3ac6b355548b9fda3ec3c2db0"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.318538 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl" event={"ID":"133fe0e9-d231-45aa-bea8-6add237cffb4","Type":"ContainerStarted","Data":"dbdb6fa2c32133fe920a859a19439d2113c2d5dc63a89c28231378fc2678ef21"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.329199 4698 generic.go:334] "Generic (PLEG): container finished" podID="568f96c6-6a68-4e06-a1e1-1b787f58bac7" containerID="2cd07ad5d354e221c5fa87d12b67b10c11fd25e109f3549965884172c437ac6b" exitCode=0 Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.329349 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" event={"ID":"568f96c6-6a68-4e06-a1e1-1b787f58bac7","Type":"ContainerDied","Data":"2cd07ad5d354e221c5fa87d12b67b10c11fd25e109f3549965884172c437ac6b"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.331589 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" event={"ID":"6b1ad070-5898-4b3a-ab57-57d781c9b809","Type":"ContainerStarted","Data":"9e045b98695f05888ac4cf9e0cd56f003eaba4d6957002b71c8cc018a46bae39"} Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.339914 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.341137 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.841114154 +0000 UTC m=+231.954728395 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.342515 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-dkf4t" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.397277 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55162: no serving certificate available for the kubelet" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.441248 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.441895 4698 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:06.941875675 +0000 UTC m=+232.055489916 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.540366 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" podStartSLOduration=187.540348763 podStartE2EDuration="3m7.540348763s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:06.539177016 +0000 UTC m=+231.652791257" watchObservedRunningTime="2026-02-24 10:20:06.540348763 +0000 UTC m=+231.653963004" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.543649 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.543813 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.043789652 +0000 UTC m=+232.157403893 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.543986 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.544545 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.04452773 +0000 UTC m=+232.158141971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.579737 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-z42jf" podStartSLOduration=188.579719488 podStartE2EDuration="3m8.579719488s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:06.578884718 +0000 UTC m=+231.692498959" watchObservedRunningTime="2026-02-24 10:20:06.579719488 +0000 UTC m=+231.693333739" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.645016 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.645219 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.145186389 +0000 UTC m=+232.258800630 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.645883 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.646314 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.146297955 +0000 UTC m=+232.259912196 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.659330 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl" podStartSLOduration=187.659304077 podStartE2EDuration="3m7.659304077s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:06.658432396 +0000 UTC m=+231.772046637" watchObservedRunningTime="2026-02-24 10:20:06.659304077 +0000 UTC m=+231.772918318" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.660787 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" podStartSLOduration=187.660779611 podStartE2EDuration="3m7.660779611s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:06.61768512 +0000 UTC m=+231.731299361" watchObservedRunningTime="2026-02-24 10:20:06.660779611 +0000 UTC m=+231.774393852" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.700600 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-clwmh" podStartSLOduration=188.700582355 podStartE2EDuration="3m8.700582355s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:06.69730896 +0000 UTC m=+231.810923211" watchObservedRunningTime="2026-02-24 10:20:06.700582355 +0000 UTC m=+231.814196596" Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.762155 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.762305 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.262285639 +0000 UTC m=+232.375899880 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.765688 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55164: no serving certificate available for the kubelet"
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.772316 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.772834 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.272819274 +0000 UTC m=+232.386433515 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.785350 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jrgwq" podStartSLOduration=188.785330285 podStartE2EDuration="3m8.785330285s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:06.75930287 +0000 UTC m=+231.872917131" watchObservedRunningTime="2026-02-24 10:20:06.785330285 +0000 UTC m=+231.898944526"
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.787031 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-trk7h" podStartSLOduration=187.787024034 podStartE2EDuration="3m7.787024034s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:06.78255514 +0000 UTC m=+231.896169381" watchObservedRunningTime="2026-02-24 10:20:06.787024034 +0000 UTC m=+231.900638275"
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.838720 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-265tl" podStartSLOduration=187.838685014 podStartE2EDuration="3m7.838685014s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:06.82428682 +0000 UTC m=+231.937901071" watchObservedRunningTime="2026-02-24 10:20:06.838685014 +0000 UTC m=+231.952299275"
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.857434 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-x8tds" podStartSLOduration=188.857404569 podStartE2EDuration="3m8.857404569s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:06.855902234 +0000 UTC m=+231.969516475" watchObservedRunningTime="2026-02-24 10:20:06.857404569 +0000 UTC m=+231.971018810"
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.874251 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.874571 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.374554238 +0000 UTC m=+232.488168479 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.909564 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-trk7h"
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.911091 4698 patch_prober.go:28] interesting pod/router-default-5444994796-trk7h container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.911137 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-trk7h" podUID="b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.941695 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" podStartSLOduration=188.941676238 podStartE2EDuration="3m8.941676238s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:06.94049206 +0000 UTC m=+232.054106321" watchObservedRunningTime="2026-02-24 10:20:06.941676238 +0000 UTC m=+232.055290479"
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.973735 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-snxxh" podStartSLOduration=188.973720031 podStartE2EDuration="3m8.973720031s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:06.971833058 +0000 UTC m=+232.085447299" watchObservedRunningTime="2026-02-24 10:20:06.973720031 +0000 UTC m=+232.087334272"
Feb 24 10:20:06 crc kubenswrapper[4698]: I0224 10:20:06.975217 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:06 crc kubenswrapper[4698]: E0224 10:20:06.975529 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.475518164 +0000 UTC m=+232.589132405 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.018574 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-qzmkf" podStartSLOduration=189.018556913 podStartE2EDuration="3m9.018556913s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.018135893 +0000 UTC m=+232.131750144" watchObservedRunningTime="2026-02-24 10:20:07.018556913 +0000 UTC m=+232.132171154"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.054288 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-97zrr" podStartSLOduration=188.054268523 podStartE2EDuration="3m8.054268523s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.05200548 +0000 UTC m=+232.165619721" watchObservedRunningTime="2026-02-24 10:20:07.054268523 +0000 UTC m=+232.167882754"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.076683 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.076883 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.576855918 +0000 UTC m=+232.690470159 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.077166 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.077446 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.577439542 +0000 UTC m=+232.691053783 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.177740 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.178154 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.67813683 +0000 UTC m=+232.791751071 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.279348 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.279700 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.77968445 +0000 UTC m=+232.893298691 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.341674 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd" event={"ID":"f5c8edb8-fc4d-440e-94a0-116059aed6ad","Type":"ContainerStarted","Data":"72a4894b1d6d3a23cc83fbd7d869d7231c2a172eec3a9de33bc6bdffd45ea209"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.343222 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb" event={"ID":"04421fdc-439e-4b78-b6ce-fcf8957ddf92","Type":"ContainerStarted","Data":"dce9727483384aff00357150af74ee407dfba249f7546875ee7375ea09a1d4af"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.345813 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh" event={"ID":"0afb3450-a53f-4b58-9472-7bbec9f4eb54","Type":"ContainerStarted","Data":"50e7abde15e79add8e5fb7055cea13d8446f346db75b34c68b3e8cd67867480a"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.345858 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh" event={"ID":"0afb3450-a53f-4b58-9472-7bbec9f4eb54","Type":"ContainerStarted","Data":"d6e422dec8f1431f1ae7a3bba1d0340adf043cafcfe98579356235615e605987"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.347459 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7" event={"ID":"e7922021-adba-44fd-aff2-2f0776f3fabe","Type":"ContainerStarted","Data":"7c0e20dd1dc1b8348778b97bfa923863b747cabb8a82f3f6e99f34b7076b458f"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.347502 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7" event={"ID":"e7922021-adba-44fd-aff2-2f0776f3fabe","Type":"ContainerStarted","Data":"90263cdb1e9473990dc98c21a261486370568713ded1db14570d8bc63239b232"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.349584 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv" event={"ID":"50b7cda3-dd1c-4644-b5a6-23957a406b19","Type":"ContainerStarted","Data":"6ef40b11ed1e44b064b850b11f7c143400ed418fb27b6721847fd9ef7152225a"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.353281 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" event={"ID":"38bdf14d-35ac-440b-9a16-9a4ddd53df34","Type":"ContainerStarted","Data":"25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.353513 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.356417 4698 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-w8bmq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.356464 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" podUID="38bdf14d-35ac-440b-9a16-9a4ddd53df34" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.357190 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-shwvb" podStartSLOduration=188.35717811 podStartE2EDuration="3m8.35717811s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.356589697 +0000 UTC m=+232.470203948" watchObservedRunningTime="2026-02-24 10:20:07.35717811 +0000 UTC m=+232.470792351"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.361773 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6kd89" event={"ID":"916d037d-f52e-449e-8496-34695060f8d5","Type":"ContainerStarted","Data":"1ce2792454d4bfa6692f893bb96ebf7bc846e128b2cb79d2eb9f96c7dae6b986"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.364255 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd" event={"ID":"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9","Type":"ContainerStarted","Data":"30070b246afa44ae2e4a30ef07801b0a6c263138ff561a7d5617b9095de13e61"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.377613 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m" event={"ID":"f688bfd3-2a09-4640-8ec8-9b69cc9881c4","Type":"ContainerStarted","Data":"49dd6243480e3260cf260c69b6bf7b4aeb18b8ec4d8fe47b68ad6c9998ecfac8"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.377653 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m" event={"ID":"f688bfd3-2a09-4640-8ec8-9b69cc9881c4","Type":"ContainerStarted","Data":"4997cf8c29af6612845bd90fa1d930f93fcf468df942a471b4f3aa0d1c4b9ec3"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.380655 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" podStartSLOduration=189.380630285 podStartE2EDuration="3m9.380630285s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.374389411 +0000 UTC m=+232.488003652" watchObservedRunningTime="2026-02-24 10:20:07.380630285 +0000 UTC m=+232.494244546"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.381809 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.382340 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.882320504 +0000 UTC m=+232.995934755 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.391558 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" event={"ID":"59c9844c-00fe-42cd-add6-9ab528da273d","Type":"ContainerStarted","Data":"8a8058da5db0ddd30db1d08f69ec4e9b30e98aac7cdeb33d03f7194dfbef85d3"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.398948 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-skflh" podStartSLOduration=188.39893082 podStartE2EDuration="3m8.39893082s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.396910174 +0000 UTC m=+232.510524425" watchObservedRunningTime="2026-02-24 10:20:07.39893082 +0000 UTC m=+232.512545061"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.403569 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6h5bj" event={"ID":"4a2469de-c5ab-4a39-9168-01e03bd4b1c6","Type":"ContainerStarted","Data":"cd63913e86736402a50ed20d5fd07d546cdbb81561b7956cef7471e652085019"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.403620 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6h5bj" event={"ID":"4a2469de-c5ab-4a39-9168-01e03bd4b1c6","Type":"ContainerStarted","Data":"f6d73b8c8c99f5461ba98f77d0bb92421e404b373743eefdd694d52c86a41c6c"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.404452 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6h5bj"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.411691 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" event={"ID":"50f7a0ea-7b15-487b-b907-6fb4c7451eed","Type":"ContainerStarted","Data":"0e7f69bae45b5c17b65187ff0d92b1181ace2c8b59a2b67a924075c3aa0af485"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.412530 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.421252 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-gnxh7" podStartSLOduration=188.421232669 podStartE2EDuration="3m8.421232669s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.419501799 +0000 UTC m=+232.533116040" watchObservedRunningTime="2026-02-24 10:20:07.421232669 +0000 UTC m=+232.534846930"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.428910 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-f69tr" event={"ID":"c45160a9-f0cb-4b39-ad29-67c14871973f","Type":"ContainerStarted","Data":"1df1c04e1d04424a7e7d3321556bed132fc90490cee8f10282d579174dfafa82"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.434737 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc" event={"ID":"b681f586-b4e0-4b2a-ab97-ea20583eeb34","Type":"ContainerStarted","Data":"e260ddc0639ed2e61594bf70d145a2478855774e2287dee7d0fe039c555f8d21"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.435562 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.437624 4698 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bz5kc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body=
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.437670 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc" podUID="b681f586-b4e0-4b2a-ab97-ea20583eeb34" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.450855 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-z54pv" podStartSLOduration=188.450838506 podStartE2EDuration="3m8.450838506s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.448706957 +0000 UTC m=+232.562321198" watchObservedRunningTime="2026-02-24 10:20:07.450838506 +0000 UTC m=+232.564452747"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.453491 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj" event={"ID":"312007fb-fd23-4a36-b653-ea3e24a02ee0","Type":"ContainerStarted","Data":"3314bf042e12bb5de5e778292be0056106867119382d4121530648d4c8f5f421"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.455801 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h" event={"ID":"875da7ec-7eeb-4f5c-b849-73863732ebb2","Type":"ContainerStarted","Data":"901c67ee54530450b8ca85ce48522f33cf078eaa57a1f5b5b79a7caf6871d477"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.461957 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" event={"ID":"fccccb67-888f-4a34-a701-61926e9819a6","Type":"ContainerStarted","Data":"eaf4d5a6acc3df4a9de5cc3ba0fe8677d5a94b12ee75723d333981a1228e3c46"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.484216 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55170: no serving certificate available for the kubelet"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.484463 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" event={"ID":"568f96c6-6a68-4e06-a1e1-1b787f58bac7","Type":"ContainerStarted","Data":"d4f15d175f7af1fce0f3b9c83f247a4dac4e1f172bae3b37bb17e785be7b3b59"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.484943 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.485888 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-f69tr" podStartSLOduration=188.48587235 podStartE2EDuration="3m8.48587235s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.484394276 +0000 UTC m=+232.598008517" watchObservedRunningTime="2026-02-24 10:20:07.48587235 +0000 UTC m=+232.599486581"
Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.486349 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:07.986336272 +0000 UTC m=+233.099950513 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.487674 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" event={"ID":"d0de08e0-63c0-4a90-a264-1bc41b8746d8","Type":"ContainerStarted","Data":"186142add921ff86958b67725d353a8e19fd4648ca6aa0e8301a4a1c96f0a1cd"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.488373 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-79f62"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.495670 4698 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-79f62 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.495730 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" podUID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.500826 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z" event={"ID":"aed98b22-f91a-4aba-ab64-65fc09af1478","Type":"ContainerStarted","Data":"69ba5329ddf2f9d1a63f6469f7ecf2746f3e7b7118fa5b30cc0407044b6a1230"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.514632 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp" event={"ID":"e3a102f6-2a75-4096-806a-7af5eca816e0","Type":"ContainerStarted","Data":"139ddc0b39d022d4e7ee0f1b0fed2d998dfb08a5bc4fe4308fe8a5d67edb5d97"}
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.515122 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp"
Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.518878 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc" podStartSLOduration=188.518862697 podStartE2EDuration="3m8.518862697s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.515094989 +0000 UTC m=+232.628709230" watchObservedRunningTime="2026-02-24 10:20:07.518862697 +0000 UTC
m=+232.632476938" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.523407 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs" event={"ID":"b07a9333-815e-464f-afc6-28c1da857d84","Type":"ContainerStarted","Data":"afbf079adcd951d1f7120d372f31516330dea1bbc27df46d6cae4103c53d50b7"} Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.523444 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs" event={"ID":"b07a9333-815e-464f-afc6-28c1da857d84","Type":"ContainerStarted","Data":"8b479a471036daa00b4ba087bdb66fd8666a246f81447a8718e0d7a294ab1135"} Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.539682 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-fq25r" podStartSLOduration=189.539663671 podStartE2EDuration="3m9.539663671s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.537574282 +0000 UTC m=+232.651188533" watchObservedRunningTime="2026-02-24 10:20:07.539663671 +0000 UTC m=+232.653277912" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.541941 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc" event={"ID":"29e7e4f9-c6e0-4a3a-8ec6-5c863c192667","Type":"ContainerStarted","Data":"9f7978ad81e8271496c4f76712a670c095877d37338f4b348db7df4d81029cb8"} Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.542241 4698 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bfmpl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 
10.217.0.8:8443: connect: connection refused" start-of-body= Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.542304 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" podUID="7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.542844 4698 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-sn5g8 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.542975 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8" podUID="bbfa8949-aef4-4d80-8ece-7af18d74a9a0" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.542896 4698 patch_prober.go:28] interesting pod/console-operator-58897d9998-qzmkf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.543129 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qzmkf" podUID="9c9967ed-20af-48cf-859d-4c3060d413fb" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" Feb 24 10:20:07 crc 
kubenswrapper[4698]: I0224 10:20:07.543346 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.543391 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.543706 4698 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-5tdjl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" start-of-body= Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.544045 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl" podUID="c230438c-2633-4e31-b0da-b1d037e35e0c" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.30:5443/healthz\": dial tcp 10.217.0.30:5443: connect: connection refused" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.546182 4698 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-hxxxs container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body= Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.546222 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" 
podUID="803e0d1c-f298-49b4-9251-9271f311ee92" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.568451 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6kd89" podStartSLOduration=7.568427428 podStartE2EDuration="7.568427428s" podCreationTimestamp="2026-02-24 10:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.559988382 +0000 UTC m=+232.673602623" watchObservedRunningTime="2026-02-24 10:20:07.568427428 +0000 UTC m=+232.682041659" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.584573 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6h5bj" podStartSLOduration=7.584557364 podStartE2EDuration="7.584557364s" podCreationTimestamp="2026-02-24 10:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.583784975 +0000 UTC m=+232.697399216" watchObservedRunningTime="2026-02-24 10:20:07.584557364 +0000 UTC m=+232.698171605" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.586151 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.587290 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.087275077 +0000 UTC m=+233.200889318 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.634769 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-w5v5m" podStartSLOduration=188.6347503 podStartE2EDuration="3m8.6347503s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.600506994 +0000 UTC m=+232.714121235" watchObservedRunningTime="2026-02-24 10:20:07.6347503 +0000 UTC m=+232.748364561" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.657761 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" podStartSLOduration=189.657743934 podStartE2EDuration="3m9.657743934s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.657177201 +0000 UTC m=+232.770791442" watchObservedRunningTime="2026-02-24 10:20:07.657743934 +0000 UTC m=+232.771358185" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.660010 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd" podStartSLOduration=189.659999966 podStartE2EDuration="3m9.659999966s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.632599789 +0000 UTC m=+232.746214030" watchObservedRunningTime="2026-02-24 10:20:07.659999966 +0000 UTC m=+232.773614207" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.688167 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.690089 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.190060784 +0000 UTC m=+233.303675125 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.702881 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-5582h" podStartSLOduration=188.702865692 podStartE2EDuration="3m8.702865692s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.702152476 +0000 UTC m=+232.815766737" watchObservedRunningTime="2026-02-24 10:20:07.702865692 +0000 UTC m=+232.816479933" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.703516 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" podStartSLOduration=188.703511627 podStartE2EDuration="3m8.703511627s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.682129 +0000 UTC m=+232.795743281" watchObservedRunningTime="2026-02-24 10:20:07.703511627 +0000 UTC m=+232.817125868" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.717803 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-v6qbj" podStartSLOduration=189.717782449 podStartE2EDuration="3m9.717782449s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.717617025 +0000 UTC m=+232.831231266" watchObservedRunningTime="2026-02-24 10:20:07.717782449 +0000 UTC m=+232.831396690" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.744164 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp" podStartSLOduration=188.744144901 podStartE2EDuration="3m8.744144901s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.742606966 +0000 UTC m=+232.856221227" watchObservedRunningTime="2026-02-24 10:20:07.744144901 +0000 UTC m=+232.857759142" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.791150 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.791555 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.291536473 +0000 UTC m=+233.405150714 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.815137 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-vb7wk" podStartSLOduration=188.81511975 podStartE2EDuration="3m8.81511975s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.813330048 +0000 UTC m=+232.926944299" watchObservedRunningTime="2026-02-24 10:20:07.81511975 +0000 UTC m=+232.928733991" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.816197 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-xcfhs" podStartSLOduration=189.816191715 podStartE2EDuration="3m9.816191715s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.781232813 +0000 UTC m=+232.894847064" watchObservedRunningTime="2026-02-24 10:20:07.816191715 +0000 UTC m=+232.929805956" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.893061 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: 
\"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.893623 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.393423249 +0000 UTC m=+233.507037490 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.901511 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc" podStartSLOduration=188.901493237 podStartE2EDuration="3m8.901493237s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:07.901300922 +0000 UTC m=+233.014915163" watchObservedRunningTime="2026-02-24 10:20:07.901493237 +0000 UTC m=+233.015107468" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.903097 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-csv7z" podStartSLOduration=188.903089284 podStartE2EDuration="3m8.903089284s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 
10:20:07.857462744 +0000 UTC m=+232.971077005" watchObservedRunningTime="2026-02-24 10:20:07.903089284 +0000 UTC m=+233.016703515" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.913164 4698 patch_prober.go:28] interesting pod/router-default-5444994796-trk7h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:20:07 crc kubenswrapper[4698]: [-]has-synced failed: reason withheld Feb 24 10:20:07 crc kubenswrapper[4698]: [+]process-running ok Feb 24 10:20:07 crc kubenswrapper[4698]: healthz check failed Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.913225 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-trk7h" podUID="b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.994913 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.995084 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.495058421 +0000 UTC m=+233.608672662 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:07 crc kubenswrapper[4698]: I0224 10:20:07.995201 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:07 crc kubenswrapper[4698]: E0224 10:20:07.995607 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.495596293 +0000 UTC m=+233.609210534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.096876 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.097146 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.597122312 +0000 UTC m=+233.710736553 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.097442 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.097822 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.597811918 +0000 UTC m=+233.711426159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.198330 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.198653 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.6986387 +0000 UTC m=+233.812252931 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.299855 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.300128 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.800117188 +0000 UTC m=+233.913731429 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.401540 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.401720 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.901695828 +0000 UTC m=+234.015310069 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.402152 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.402516 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:08.902504357 +0000 UTC m=+234.016118698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.503323 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.503709 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.003679897 +0000 UTC m=+234.117294138 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.553001 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd" event={"ID":"f5c8edb8-fc4d-440e-94a0-116059aed6ad","Type":"ContainerStarted","Data":"bc107f54061a74fdf4fb307a53bd79876ad96cfae8663a48f7a3c58723205072"}
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.567409 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-fd5xc" event={"ID":"29e7e4f9-c6e0-4a3a-8ec6-5c863c192667","Type":"ContainerStarted","Data":"4752fbc563c0fb3af61982d39931c389f48572817e8d6b310041b22096043f56"}
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.571505 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" event={"ID":"568f96c6-6a68-4e06-a1e1-1b787f58bac7","Type":"ContainerStarted","Data":"24d2865179303f05e1c69f81a3150b11754864a6757541e3c3fd5469452c1bdc"}
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.572041 4698 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-79f62 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.572081 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" podUID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused"
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.572284 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body=
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.572044 4698 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bz5kc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body=
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.572320 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused"
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.572329 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc" podUID="b681f586-b4e0-4b2a-ab97-ea20583eeb34" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused"
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.572455 4698 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-w8bmq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.572499 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" podUID="38bdf14d-35ac-440b-9a16-9a4ddd53df34" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused"
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.572867 4698 patch_prober.go:28] interesting pod/console-operator-58897d9998-qzmkf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.572921 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-qzmkf" podUID="9c9967ed-20af-48cf-859d-4c3060d413fb" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/readyz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.608181 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.608561 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.108542794 +0000 UTC m=+234.222157035 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.630463 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jzvrd" podStartSLOduration=189.630449143 podStartE2EDuration="3m9.630449143s" podCreationTimestamp="2026-02-24 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:08.629326477 +0000 UTC m=+233.742940718" watchObservedRunningTime="2026-02-24 10:20:08.630449143 +0000 UTC m=+233.744063374"
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.631974 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-sn5g8"
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.709077 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.711615 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.211599038 +0000 UTC m=+234.325213279 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.714479 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" podStartSLOduration=190.714463125 podStartE2EDuration="3m10.714463125s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:08.680112806 +0000 UTC m=+233.793727047" watchObservedRunningTime="2026-02-24 10:20:08.714463125 +0000 UTC m=+233.828077366"
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.811496 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55172: no serving certificate available for the kubelet"
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.811699 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.812132 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.312116773 +0000 UTC m=+234.425731014 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.872036 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-5tdjl"
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.911432 4698 patch_prober.go:28] interesting pod/router-default-5444994796-trk7h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 24 10:20:08 crc kubenswrapper[4698]: [-]has-synced failed: reason withheld
Feb 24 10:20:08 crc kubenswrapper[4698]: [+]process-running ok
Feb 24 10:20:08 crc kubenswrapper[4698]: healthz check failed
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.911529 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-trk7h" podUID="b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.912333 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.912596 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.412582707 +0000 UTC m=+234.526196938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:08 crc kubenswrapper[4698]: I0224 10:20:08.912724 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:08 crc kubenswrapper[4698]: E0224 10:20:08.913039 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.413031658 +0000 UTC m=+234.526645900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.013465 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.013690 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.513659036 +0000 UTC m=+234.627273287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.013948 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.014334 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.514304551 +0000 UTC m=+234.627918782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.114727 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.114950 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.614925169 +0000 UTC m=+234.728539410 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.216081 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.216452 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.716437267 +0000 UTC m=+234.830051508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.317487 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.317783 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.817734211 +0000 UTC m=+234.931348462 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.418920 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.419349 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:09.919330771 +0000 UTC m=+235.032945052 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.520471 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.520682 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.020656945 +0000 UTC m=+235.134271176 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.520877 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.521180 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.021167627 +0000 UTC m=+235.134781868 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.579027 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" event={"ID":"19c1f910-f805-4151-8eb8-7a6628a62b5b","Type":"ContainerStarted","Data":"0059a607abc6d53a5d150196ea5bd5d4b4222667ab5bb544450e76f73fdf962a"}
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.579919 4698 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-79f62 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body=
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.579960 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" podUID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused"
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.599502 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bz5kc"
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.621353 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.621706 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.121691843 +0000 UTC m=+235.235306084 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.723448 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.723950 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.223930578 +0000 UTC m=+235.337544819 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.735895 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.825883 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.826053 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.32602183 +0000 UTC m=+235.439636081 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.826235 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.826607 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.326591553 +0000 UTC m=+235.440205784 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.892289 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-w8bmq"]
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.892681 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" podUID="38bdf14d-35ac-440b-9a16-9a4ddd53df34" containerName="controller-manager" containerID="cri-o://25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c" gracePeriod=30
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.894348 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"]
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.894553 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" podUID="7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" containerName="route-controller-manager" containerID="cri-o://42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642" gracePeriod=30
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.905238 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.906192 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq"
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.911198 4698 patch_prober.go:28] interesting pod/router-default-5444994796-trk7h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 24 10:20:09 crc kubenswrapper[4698]: [-]has-synced failed: reason withheld
Feb 24 10:20:09 crc kubenswrapper[4698]: [+]process-running ok
Feb 24 10:20:09 crc kubenswrapper[4698]: healthz check failed
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.911240 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-trk7h" podUID="b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.927148 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.927351 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.427327723 +0000 UTC m=+235.540941964 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:09 crc kubenswrapper[4698]: I0224 10:20:09.927462 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:09 crc kubenswrapper[4698]: E0224 10:20:09.927732 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.427720922 +0000 UTC m=+235.541335163 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.028696 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.029038 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.529025037 +0000 UTC m=+235.642639278 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.130453 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.130949 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.630934024 +0000 UTC m=+235.744548265 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.231944 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.232225 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.732210767 +0000 UTC m=+235.845825008 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.266165 4698 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-lcjcd container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.266229 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" podUID="50f7a0ea-7b15-487b-b907-6fb4c7451eed" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.266813 4698 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-lcjcd container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.266835 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" podUID="50f7a0ea-7b15-487b-b907-6fb4c7451eed" 
containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.310250 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lcjcd" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.333934 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.334292 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.834280209 +0000 UTC m=+235.947894450 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.400187 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.439775 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.440061 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:10.940046005 +0000 UTC m=+236.053660246 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.529537 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.543079 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb89h\" (UniqueName: \"kubernetes.io/projected/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-kube-api-access-kb89h\") pod \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.543127 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-config\") pod \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.543155 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-client-ca\") pod \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.543311 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-serving-cert\") pod \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\" (UID: \"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f\") " Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.543572 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 
10:20:10.543903 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:11.043888918 +0000 UTC m=+236.157503159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.545642 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-client-ca" (OuterVolumeSpecName: "client-ca") pod "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" (UID: "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.546028 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-config" (OuterVolumeSpecName: "config") pod "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" (UID: "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.554143 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-kube-api-access-kb89h" (OuterVolumeSpecName: "kube-api-access-kb89h") pod "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" (UID: "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f"). 
InnerVolumeSpecName "kube-api-access-kb89h". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.555084 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" (UID: "7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.585574 4698 generic.go:334] "Generic (PLEG): container finished" podID="7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" containerID="42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642" exitCode=0 Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.585607 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.585766 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" event={"ID":"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f","Type":"ContainerDied","Data":"42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642"} Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.586046 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl" event={"ID":"7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f","Type":"ContainerDied","Data":"a8452b76f874839619a1fc704f7226d4521298df8401c72b5f21234bde282250"} Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.586112 4698 scope.go:117] "RemoveContainer" containerID="42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.587798 4698 generic.go:334] "Generic (PLEG): 
container finished" podID="38bdf14d-35ac-440b-9a16-9a4ddd53df34" containerID="25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c" exitCode=0 Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.587927 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.587932 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" event={"ID":"38bdf14d-35ac-440b-9a16-9a4ddd53df34","Type":"ContainerDied","Data":"25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c"} Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.588094 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-w8bmq" event={"ID":"38bdf14d-35ac-440b-9a16-9a4ddd53df34","Type":"ContainerDied","Data":"7f6aa9a1f37035e1ace1fe1d70b4313c6f79e1d8550e2f9e8fa80e3d07fd539b"} Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.597833 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" event={"ID":"19c1f910-f805-4151-8eb8-7a6628a62b5b","Type":"ContainerStarted","Data":"0e2bf4cad9c9a9283184a21cdfffb04d0e0e5e2f330c7a4f737f985a114c78f8"} Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.611748 4698 scope.go:117] "RemoveContainer" containerID="42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642" Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.613396 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642\": container with ID starting with 42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642 not found: ID does not exist" 
containerID="42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.613427 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642"} err="failed to get container status \"42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642\": rpc error: code = NotFound desc = could not find container \"42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642\": container with ID starting with 42c76dca03b0dc1d84b11b5d0a93719eaccc1e2cfed76f09c96ba776a1105642 not found: ID does not exist" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.613446 4698 scope.go:117] "RemoveContainer" containerID="25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.616797 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"] Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.620578 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bfmpl"] Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.622527 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8hhkf"] Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.622700 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bdf14d-35ac-440b-9a16-9a4ddd53df34" containerName="controller-manager" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.622716 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bdf14d-35ac-440b-9a16-9a4ddd53df34" containerName="controller-manager" Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.622727 4698 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" containerName="route-controller-manager" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.622733 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" containerName="route-controller-manager" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.622830 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="38bdf14d-35ac-440b-9a16-9a4ddd53df34" containerName="controller-manager" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.622843 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" containerName="route-controller-manager" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.623547 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.625192 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.629773 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hhkf"] Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.643861 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles\") pod \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.643961 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert\") pod \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " Feb 24 10:20:10 crc 
kubenswrapper[4698]: I0224 10:20:10.643985 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h46gj\" (UniqueName: \"kubernetes.io/projected/38bdf14d-35ac-440b-9a16-9a4ddd53df34-kube-api-access-h46gj\") pod \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.644008 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-config\") pod \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.644666 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "38bdf14d-35ac-440b-9a16-9a4ddd53df34" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.645151 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-config" (OuterVolumeSpecName: "config") pod "38bdf14d-35ac-440b-9a16-9a4ddd53df34" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.645244 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.645314 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-client-ca\") pod \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\" (UID: \"38bdf14d-35ac-440b-9a16-9a4ddd53df34\") " Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.645590 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.645611 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.645623 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb89h\" (UniqueName: \"kubernetes.io/projected/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-kube-api-access-kb89h\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.645634 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.645644 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.645652 4698 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.646103 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-client-ca" (OuterVolumeSpecName: "client-ca") pod "38bdf14d-35ac-440b-9a16-9a4ddd53df34" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.646167 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:11.146150804 +0000 UTC m=+236.259765045 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.647513 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bdf14d-35ac-440b-9a16-9a4ddd53df34-kube-api-access-h46gj" (OuterVolumeSpecName: "kube-api-access-h46gj") pod "38bdf14d-35ac-440b-9a16-9a4ddd53df34" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34"). InnerVolumeSpecName "kube-api-access-h46gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.647625 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "38bdf14d-35ac-440b-9a16-9a4ddd53df34" (UID: "38bdf14d-35ac-440b-9a16-9a4ddd53df34"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.653451 4698 scope.go:117] "RemoveContainer" containerID="25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c" Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.657352 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c\": container with ID starting with 25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c not found: ID does not exist" containerID="25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.657391 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c"} err="failed to get container status \"25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c\": rpc error: code = NotFound desc = could not find container \"25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c\": container with ID starting with 25dda536ac674dae1f804711c11daa5514f71382564e0f5a04bed966cdb2995c not found: ID does not exist" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.746578 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-utilities\") pod \"community-operators-8hhkf\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") " pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.746729 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.746774 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz7zw\" (UniqueName: \"kubernetes.io/projected/2eee2a16-171b-402e-9549-3d14cb56cddc-kube-api-access-zz7zw\") pod \"community-operators-8hhkf\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") " pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.746852 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-catalog-content\") pod \"community-operators-8hhkf\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") " pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.746960 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38bdf14d-35ac-440b-9a16-9a4ddd53df34-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.746984 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h46gj\" (UniqueName: \"kubernetes.io/projected/38bdf14d-35ac-440b-9a16-9a4ddd53df34-kube-api-access-h46gj\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.747006 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/38bdf14d-35ac-440b-9a16-9a4ddd53df34-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.747868 4698 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:11.247789915 +0000 UTC m=+236.361404156 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.824277 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-p5dwb"] Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.825100 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.828019 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.838474 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5dwb"] Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.848026 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.848271 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz7zw\" 
(UniqueName: \"kubernetes.io/projected/2eee2a16-171b-402e-9549-3d14cb56cddc-kube-api-access-zz7zw\") pod \"community-operators-8hhkf\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") " pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.848379 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:11.348351592 +0000 UTC m=+236.461965843 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.848482 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-catalog-content\") pod \"community-operators-8hhkf\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") " pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.848659 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-utilities\") pod \"community-operators-8hhkf\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") " pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.849282 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-catalog-content\") pod \"community-operators-8hhkf\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") " pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.849310 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-utilities\") pod \"community-operators-8hhkf\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") " pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.882099 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz7zw\" (UniqueName: \"kubernetes.io/projected/2eee2a16-171b-402e-9549-3d14cb56cddc-kube-api-access-zz7zw\") pod \"community-operators-8hhkf\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") " pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.914673 4698 patch_prober.go:28] interesting pod/router-default-5444994796-trk7h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:20:10 crc kubenswrapper[4698]: [-]has-synced failed: reason withheld Feb 24 10:20:10 crc kubenswrapper[4698]: [+]process-running ok Feb 24 10:20:10 crc kubenswrapper[4698]: healthz check failed Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.914746 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-trk7h" podUID="b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.936459 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-w8bmq"] Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.938961 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.939639 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-w8bmq"] Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.948134 4698 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.952463 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.952585 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b27gl\" (UniqueName: \"kubernetes.io/projected/19022af1-394c-4aab-9eb1-ffb0f566d0ac-kube-api-access-b27gl\") pod \"certified-operators-p5dwb\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") " pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.952608 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-catalog-content\") pod \"certified-operators-p5dwb\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") " pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 
10:20:10 crc kubenswrapper[4698]: I0224 10:20:10.952634 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-utilities\") pod \"certified-operators-p5dwb\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") " pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:20:10 crc kubenswrapper[4698]: E0224 10:20:10.952960 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:11.452946371 +0000 UTC m=+236.566560612 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.026473 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2z8bk"] Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.027583 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.088330 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.088544 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b27gl\" (UniqueName: \"kubernetes.io/projected/19022af1-394c-4aab-9eb1-ffb0f566d0ac-kube-api-access-b27gl\") pod \"certified-operators-p5dwb\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") " pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.088575 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-catalog-content\") pod \"certified-operators-p5dwb\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") " pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.088596 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-utilities\") pod \"certified-operators-p5dwb\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") " pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:20:11 crc kubenswrapper[4698]: E0224 10:20:11.090315 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-24 10:20:11.590296803 +0000 UTC m=+236.703911034 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.090706 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-utilities\") pod \"certified-operators-p5dwb\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") " pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.090734 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-catalog-content\") pod \"certified-operators-p5dwb\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") " pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.106168 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b27gl\" (UniqueName: \"kubernetes.io/projected/19022af1-394c-4aab-9eb1-ffb0f566d0ac-kube-api-access-b27gl\") pod \"certified-operators-p5dwb\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") " pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.115277 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2z8bk"] Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.149701 4698 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.190473 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-utilities\") pod \"community-operators-2z8bk\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") " pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.190526 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.190549 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-catalog-content\") pod \"community-operators-2z8bk\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") " pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.190590 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w26kv\" (UniqueName: \"kubernetes.io/projected/5149fd4f-19d7-4852-b09a-d9909b8231dd-kube-api-access-w26kv\") pod \"community-operators-2z8bk\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") " pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:20:11 crc kubenswrapper[4698]: E0224 10:20:11.191153 4698 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:11.691141976 +0000 UTC m=+236.804756217 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.237464 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-59zd2"] Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.238422 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.241077 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8hhkf"] Feb 24 10:20:11 crc kubenswrapper[4698]: W0224 10:20:11.248172 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eee2a16_171b_402e_9549_3d14cb56cddc.slice/crio-472cf45ebe9ea39b507669fabcea23daff363164b2f5a031403b9d5d12c57ea3 WatchSource:0}: Error finding container 472cf45ebe9ea39b507669fabcea23daff363164b2f5a031403b9d5d12c57ea3: Status 404 returned error can't find the container with id 472cf45ebe9ea39b507669fabcea23daff363164b2f5a031403b9d5d12c57ea3 Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.248704 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59zd2"] Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 
10:20:11.292331 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.292650 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhsl8\" (UniqueName: \"kubernetes.io/projected/d16dadf6-b01e-4bda-b24b-d63801c9bf23-kube-api-access-hhsl8\") pod \"certified-operators-59zd2\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:20:11 crc kubenswrapper[4698]: E0224 10:20:11.293007 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:11.792989422 +0000 UTC m=+236.906603663 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.293063 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-utilities\") pod \"certified-operators-59zd2\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.293095 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-utilities\") pod \"community-operators-2z8bk\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") " pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.293121 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-catalog-content\") pod \"community-operators-2z8bk\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") " pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.293141 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: 
\"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.293172 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-catalog-content\") pod \"certified-operators-59zd2\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.293193 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w26kv\" (UniqueName: \"kubernetes.io/projected/5149fd4f-19d7-4852-b09a-d9909b8231dd-kube-api-access-w26kv\") pod \"community-operators-2z8bk\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") " pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.293520 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-utilities\") pod \"community-operators-2z8bk\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") " pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.293706 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-catalog-content\") pod \"community-operators-2z8bk\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") " pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:20:11 crc kubenswrapper[4698]: E0224 10:20:11.294131 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-24 10:20:11.794116008 +0000 UTC m=+236.907730249 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.316475 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w26kv\" (UniqueName: \"kubernetes.io/projected/5149fd4f-19d7-4852-b09a-d9909b8231dd-kube-api-access-w26kv\") pod \"community-operators-2z8bk\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") " pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.397861 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-p5dwb"] Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.398204 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55178: no serving certificate available for the kubelet" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.398631 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.398873 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-catalog-content\") pod 
\"certified-operators-59zd2\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.398903 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhsl8\" (UniqueName: \"kubernetes.io/projected/d16dadf6-b01e-4bda-b24b-d63801c9bf23-kube-api-access-hhsl8\") pod \"certified-operators-59zd2\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.398945 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-utilities\") pod \"certified-operators-59zd2\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.399328 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-utilities\") pod \"certified-operators-59zd2\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:20:11 crc kubenswrapper[4698]: E0224 10:20:11.399393 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:11.899379984 +0000 UTC m=+237.012994225 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.399598 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-catalog-content\") pod \"certified-operators-59zd2\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.432911 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhsl8\" (UniqueName: \"kubernetes.io/projected/d16dadf6-b01e-4bda-b24b-d63801c9bf23-kube-api-access-hhsl8\") pod \"certified-operators-59zd2\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.438463 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.499904 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:11 crc kubenswrapper[4698]: E0224 10:20:11.500315 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:12.000299368 +0000 UTC m=+237.113913609 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.603487 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 24 10:20:11 crc kubenswrapper[4698]: E0224 10:20:11.603750 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:12.103703301 +0000 UTC m=+237.217317592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.614754 4698 generic.go:334] "Generic (PLEG): container finished" podID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" containerID="cc4ae6b045233402d1e99fcc621d900e4d4a1f727d1d78289575c72a48379b26" exitCode=0
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.616810 4698 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.620427 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59zd2"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.625980 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38bdf14d-35ac-440b-9a16-9a4ddd53df34" path="/var/lib/kubelet/pods/38bdf14d-35ac-440b-9a16-9a4ddd53df34/volumes"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.626756 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f" path="/var/lib/kubelet/pods/7141f48f-7462-4e1a-a90f-b8ff3b9a8d9f/volumes"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.628120 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5dwb" event={"ID":"19022af1-394c-4aab-9eb1-ffb0f566d0ac","Type":"ContainerDied","Data":"cc4ae6b045233402d1e99fcc621d900e4d4a1f727d1d78289575c72a48379b26"}
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.628756 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5dwb" event={"ID":"19022af1-394c-4aab-9eb1-ffb0f566d0ac","Type":"ContainerStarted","Data":"d9485b1a7189bca91d54d193b3de764840766a0cb914177787a5f204b8ad2a6c"}
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.628774 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" event={"ID":"19c1f910-f805-4151-8eb8-7a6628a62b5b","Type":"ContainerStarted","Data":"44e3d26408600fb2bda861f31cf1aea2240d33e7e54e88cb201363daa40e0b69"}
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.628785 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" event={"ID":"19c1f910-f805-4151-8eb8-7a6628a62b5b","Type":"ContainerStarted","Data":"fbd96b679ce2aa371c3d11927810ec62ce6d9582b11781ddbf9be93398067fe0"}
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.629822 4698 generic.go:334] "Generic (PLEG): container finished" podID="2eee2a16-171b-402e-9549-3d14cb56cddc" containerID="a45c86071d35edeee892f9ff893fd65800461fae44ce4d762643dd1b8709070f" exitCode=0
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.629870 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hhkf" event={"ID":"2eee2a16-171b-402e-9549-3d14cb56cddc","Type":"ContainerDied","Data":"a45c86071d35edeee892f9ff893fd65800461fae44ce4d762643dd1b8709070f"}
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.629889 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hhkf" event={"ID":"2eee2a16-171b-402e-9549-3d14cb56cddc","Type":"ContainerStarted","Data":"472cf45ebe9ea39b507669fabcea23daff363164b2f5a031403b9d5d12c57ea3"}
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.657731 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.659111 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.662523 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.662861 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.667970 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.675787 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-fbt6k" podStartSLOduration=11.675764746 podStartE2EDuration="11.675764746s" podCreationTimestamp="2026-02-24 10:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:11.657688145 +0000 UTC m=+236.771302386" watchObservedRunningTime="2026-02-24 10:20:11.675764746 +0000 UTC m=+236.789378987"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.705344 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.705408 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80fe6957-b898-4120-8959-a0e840e8c4f5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"80fe6957-b898-4120-8959-a0e840e8c4f5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.705425 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80fe6957-b898-4120-8959-a0e840e8c4f5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"80fe6957-b898-4120-8959-a0e840e8c4f5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 10:20:11 crc kubenswrapper[4698]: E0224 10:20:11.707570 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-24 10:20:12.207559274 +0000 UTC m=+237.321173505 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bmr2l" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.720714 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2z8bk"]
Feb 24 10:20:11 crc kubenswrapper[4698]: W0224 10:20:11.731408 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5149fd4f_19d7_4852_b09a_d9909b8231dd.slice/crio-ae52dc38bccc1f74a86c16be9229ddbe43c140c2e39b5f1f23bcb0f790b206d8 WatchSource:0}: Error finding container ae52dc38bccc1f74a86c16be9229ddbe43c140c2e39b5f1f23bcb0f790b206d8: Status 404 returned error can't find the container with id ae52dc38bccc1f74a86c16be9229ddbe43c140c2e39b5f1f23bcb0f790b206d8
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.778818 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"]
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.779501 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.782694 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.782833 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.782956 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.783085 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.783172 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.783279 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.787315 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d856f7db-zpnrp"]
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.808574 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"]
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.808680 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.808732 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.808941 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80fe6957-b898-4120-8959-a0e840e8c4f5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"80fe6957-b898-4120-8959-a0e840e8c4f5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.808960 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80fe6957-b898-4120-8959-a0e840e8c4f5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"80fe6957-b898-4120-8959-a0e840e8c4f5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.808982 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwjzt\" (UniqueName: \"kubernetes.io/projected/1925aa4a-90ca-4879-8030-1809fec30eb7-kube-api-access-hwjzt\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.809003 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-client-ca\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.809021 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1925aa4a-90ca-4879-8030-1809fec30eb7-serving-cert\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.809054 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-config\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: E0224 10:20:11.809130 4698 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-24 10:20:12.309118134 +0000 UTC m=+237.422732375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.809150 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80fe6957-b898-4120-8959-a0e840e8c4f5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"80fe6957-b898-4120-8959-a0e840e8c4f5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.811679 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d856f7db-zpnrp"]
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.811892 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.812455 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.812777 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.812908 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.813017 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.813149 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.830037 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.835143 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80fe6957-b898-4120-8959-a0e840e8c4f5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"80fe6957-b898-4120-8959-a0e840e8c4f5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.868638 4698 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-24T10:20:10.948156421Z","Handler":null,"Name":""}
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.873120 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59zd2"]
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.877521 4698 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.877571 4698 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Feb 24 10:20:11 crc kubenswrapper[4698]: W0224 10:20:11.879017 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd16dadf6_b01e_4bda_b24b_d63801c9bf23.slice/crio-70733949ca29acd9ad2f6c278baae0692c8eea2e136ecf93a964f64c5c7c0edd WatchSource:0}: Error finding container 70733949ca29acd9ad2f6c278baae0692c8eea2e136ecf93a964f64c5c7c0edd: Status 404 returned error can't find the container with id 70733949ca29acd9ad2f6c278baae0692c8eea2e136ecf93a964f64c5c7c0edd
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.909825 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-client-ca\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.909862 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1925aa4a-90ca-4879-8030-1809fec30eb7-serving-cert\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.909883 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqbnn\" (UniqueName: \"kubernetes.io/projected/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-kube-api-access-qqbnn\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.909938 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-config\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.909980 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-proxy-ca-bundles\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.910006 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-serving-cert\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.910027 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-client-ca\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.910234 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.910426 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwjzt\" (UniqueName: \"kubernetes.io/projected/1925aa4a-90ca-4879-8030-1809fec30eb7-kube-api-access-hwjzt\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.910458 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-config\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.910971 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-client-ca\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.911084 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-config\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.913806 4698 patch_prober.go:28] interesting pod/router-default-5444994796-trk7h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 24 10:20:11 crc kubenswrapper[4698]: [-]has-synced failed: reason withheld
Feb 24 10:20:11 crc kubenswrapper[4698]: [+]process-running ok
Feb 24 10:20:11 crc kubenswrapper[4698]: healthz check failed
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.913860 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-trk7h" podUID="b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.914412 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1925aa4a-90ca-4879-8030-1809fec30eb7-serving-cert\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.917865 4698 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.917893 4698 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.933705 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwjzt\" (UniqueName: \"kubernetes.io/projected/1925aa4a-90ca-4879-8030-1809fec30eb7-kube-api-access-hwjzt\") pod \"route-controller-manager-5b56f5b4c8-c4lgc\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") " pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.947460 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bmr2l\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") " pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.973856 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 24 10:20:11 crc kubenswrapper[4698]: I0224 10:20:11.978481 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.011677 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.011961 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-config\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.012035 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqbnn\" (UniqueName: \"kubernetes.io/projected/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-kube-api-access-qqbnn\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.012108 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-proxy-ca-bundles\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.012143 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-serving-cert\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.012165 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-client-ca\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.013432 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-client-ca\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.014099 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-proxy-ca-bundles\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.015652 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-config\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.018119 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-serving-cert\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.026103 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.029947 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqbnn\" (UniqueName: \"kubernetes.io/projected/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-kube-api-access-qqbnn\") pod \"controller-manager-d856f7db-zpnrp\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") " pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.141111 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.163637 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.192601 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.267794 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bmr2l"]
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.369750 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"]
Feb 24 10:20:12 crc kubenswrapper[4698]: W0224 10:20:12.381319 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1925aa4a_90ca_4879_8030_1809fec30eb7.slice/crio-5babffb38530a77ca75c674ea7727e6b8045c76ad9c33402d1ad8434232809c4 WatchSource:0}: Error finding container 5babffb38530a77ca75c674ea7727e6b8045c76ad9c33402d1ad8434232809c4: Status 404 returned error can't find the container with id 5babffb38530a77ca75c674ea7727e6b8045c76ad9c33402d1ad8434232809c4
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.412396 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d856f7db-zpnrp"]
Feb 24 10:20:12 crc kubenswrapper[4698]: W0224 10:20:12.420651 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68542d0e_8ec2_4509_b5b2_2ec3d9a0de83.slice/crio-42f2667a706c5bb23756d19d2e336fa403d8d55795f8059d6325bb540e14b49a WatchSource:0}: Error finding container 42f2667a706c5bb23756d19d2e336fa403d8d55795f8059d6325bb540e14b49a: Status 404 returned error can't find the container with id 42f2667a706c5bb23756d19d2e336fa403d8d55795f8059d6325bb540e14b49a
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.595608 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.596282 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.603890 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.604319 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.612345 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.619530 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/155e4622-3a8e-433c-a997-147c918dc08c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"155e4622-3a8e-433c-a997-147c918dc08c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.619586 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/155e4622-3a8e-433c-a997-147c918dc08c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"155e4622-3a8e-433c-a997-147c918dc08c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.664163 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" event={"ID":"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8","Type":"ContainerStarted","Data":"dbe5d65c1a1d317becc5cb56edfb4174efe2e60d6ae5614685e4bda391a02b7f"}
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.664204 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" event={"ID":"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8","Type":"ContainerStarted","Data":"cd4d43d6cfa657e787a7689307d1aa88f14ab135f434fe79bab08c643b2f70c4"}
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.665124 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.673075 4698 generic.go:334] "Generic (PLEG): container finished" podID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" containerID="f4ea41c11e12dde4995e8ca0a2a489a9cb99bd209c4ec1ebeab649f890485aaf" exitCode=0
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.673145 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59zd2" event={"ID":"d16dadf6-b01e-4bda-b24b-d63801c9bf23","Type":"ContainerDied","Data":"f4ea41c11e12dde4995e8ca0a2a489a9cb99bd209c4ec1ebeab649f890485aaf"}
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.673176 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59zd2" event={"ID":"d16dadf6-b01e-4bda-b24b-d63801c9bf23","Type":"ContainerStarted","Data":"70733949ca29acd9ad2f6c278baae0692c8eea2e136ecf93a964f64c5c7c0edd"}
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.674882 4698 generic.go:334] "Generic (PLEG): container finished" podID="5149fd4f-19d7-4852-b09a-d9909b8231dd" containerID="6a7cd93f6b1dfe1bca4fa90156b6ee31171e9e134143bc5f66aae4f55b363fc0" exitCode=0
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.674961 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8bk" event={"ID":"5149fd4f-19d7-4852-b09a-d9909b8231dd","Type":"ContainerDied","Data":"6a7cd93f6b1dfe1bca4fa90156b6ee31171e9e134143bc5f66aae4f55b363fc0"}
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.674985 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8bk" event={"ID":"5149fd4f-19d7-4852-b09a-d9909b8231dd","Type":"ContainerStarted","Data":"ae52dc38bccc1f74a86c16be9229ddbe43c140c2e39b5f1f23bcb0f790b206d8"}
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.686861 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"80fe6957-b898-4120-8959-a0e840e8c4f5","Type":"ContainerStarted","Data":"42955b298af7b6a059fba2e728523305574efbfe2d82a1a0bcb79d49b69fe294"}
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.686905 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"80fe6957-b898-4120-8959-a0e840e8c4f5","Type":"ContainerStarted","Data":"8ef10187319d9f6ea9dde4eef8606a0900ef477e3636d80531f8566a78cf64d6"}
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.687308 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" podStartSLOduration=194.687295246 podStartE2EDuration="3m14.687295246s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:12.686535979 +0000 UTC m=+237.800150220" watchObservedRunningTime="2026-02-24 10:20:12.687295246 +0000 UTC m=+237.800909507"
Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.693975 4698 generic.go:334] "Generic (PLEG): container
finished" podID="97c3a4a8-9e33-4012-9b16-5a0de6e0ace9" containerID="30070b246afa44ae2e4a30ef07801b0a6c263138ff561a7d5617b9095de13e61" exitCode=0 Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.694063 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd" event={"ID":"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9","Type":"ContainerDied","Data":"30070b246afa44ae2e4a30ef07801b0a6c263138ff561a7d5617b9095de13e61"} Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.699376 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp" event={"ID":"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83","Type":"ContainerStarted","Data":"8508a32fb34c581a4e249c2347b593ee7a48baac6c6f739d0875a1bf90267655"} Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.699432 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp" event={"ID":"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83","Type":"ContainerStarted","Data":"42f2667a706c5bb23756d19d2e336fa403d8d55795f8059d6325bb540e14b49a"} Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.699484 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.702617 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc" event={"ID":"1925aa4a-90ca-4879-8030-1809fec30eb7","Type":"ContainerStarted","Data":"0cb0c35e7654e88f01475f7d4e84f8f797f5e57119b632e704072c3b4092efd4"} Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.702653 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc" 
event={"ID":"1925aa4a-90ca-4879-8030-1809fec30eb7","Type":"ContainerStarted","Data":"5babffb38530a77ca75c674ea7727e6b8045c76ad9c33402d1ad8434232809c4"} Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.703063 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.708862 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.720960 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/155e4622-3a8e-433c-a997-147c918dc08c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"155e4622-3a8e-433c-a997-147c918dc08c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.721212 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/155e4622-3a8e-433c-a997-147c918dc08c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"155e4622-3a8e-433c-a997-147c918dc08c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.721559 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/155e4622-3a8e-433c-a997-147c918dc08c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"155e4622-3a8e-433c-a997-147c918dc08c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.773743 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/155e4622-3a8e-433c-a997-147c918dc08c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"155e4622-3a8e-433c-a997-147c918dc08c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.785893 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.785879767 podStartE2EDuration="1.785879767s" podCreationTimestamp="2026-02-24 10:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:12.783287167 +0000 UTC m=+237.896901398" watchObservedRunningTime="2026-02-24 10:20:12.785879767 +0000 UTC m=+237.899494008" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.817516 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc" podStartSLOduration=2.817484101 podStartE2EDuration="2.817484101s" podCreationTimestamp="2026-02-24 10:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:12.81745496 +0000 UTC m=+237.931069191" watchObservedRunningTime="2026-02-24 10:20:12.817484101 +0000 UTC m=+237.931098342" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.834085 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p9jbm"] Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.835068 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.844730 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.857000 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9jbm"] Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.896527 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp" podStartSLOduration=2.896510648 podStartE2EDuration="2.896510648s" podCreationTimestamp="2026-02-24 10:20:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:12.876582524 +0000 UTC m=+237.990196775" watchObservedRunningTime="2026-02-24 10:20:12.896510648 +0000 UTC m=+238.010124879" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.912704 4698 patch_prober.go:28] interesting pod/router-default-5444994796-trk7h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:20:12 crc kubenswrapper[4698]: [-]has-synced failed: reason withheld Feb 24 10:20:12 crc kubenswrapper[4698]: [+]process-running ok Feb 24 10:20:12 crc kubenswrapper[4698]: healthz check failed Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.912758 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-trk7h" podUID="b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.915388 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.927601 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-catalog-content\") pod \"redhat-marketplace-p9jbm\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") " pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.927640 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq2ws\" (UniqueName: \"kubernetes.io/projected/25f4eaf1-6171-44dd-b225-be712a45ba1b-kube-api-access-mq2ws\") pod \"redhat-marketplace-p9jbm\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") " pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.927707 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-utilities\") pod \"redhat-marketplace-p9jbm\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") " pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:20:12 crc kubenswrapper[4698]: I0224 10:20:12.988597 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.032815 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-catalog-content\") pod \"redhat-marketplace-p9jbm\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") " pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 
10:20:13.032862 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq2ws\" (UniqueName: \"kubernetes.io/projected/25f4eaf1-6171-44dd-b225-be712a45ba1b-kube-api-access-mq2ws\") pod \"redhat-marketplace-p9jbm\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") " pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.032929 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-utilities\") pod \"redhat-marketplace-p9jbm\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") " pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.033588 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-catalog-content\") pod \"redhat-marketplace-p9jbm\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") " pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.033717 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-utilities\") pod \"redhat-marketplace-p9jbm\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") " pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.068058 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq2ws\" (UniqueName: \"kubernetes.io/projected/25f4eaf1-6171-44dd-b225-be712a45ba1b-kube-api-access-mq2ws\") pod \"redhat-marketplace-p9jbm\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") " pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.157875 4698 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.180651 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.223799 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dh74l"] Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.224877 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.240053 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dh74l"] Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.250661 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-qzmkf" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.342222 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjsf5\" (UniqueName: \"kubernetes.io/projected/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-kube-api-access-jjsf5\") pod \"redhat-marketplace-dh74l\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") " pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.346114 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-utilities\") pod \"redhat-marketplace-dh74l\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") " pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.351296 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-catalog-content\") pod \"redhat-marketplace-dh74l\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") " pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.418548 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.453421 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-utilities\") pod \"redhat-marketplace-dh74l\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") " pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.453454 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-catalog-content\") pod \"redhat-marketplace-dh74l\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") " pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.453479 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.453501 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjsf5\" (UniqueName: \"kubernetes.io/projected/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-kube-api-access-jjsf5\") pod \"redhat-marketplace-dh74l\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") " 
pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.454243 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-utilities\") pod \"redhat-marketplace-dh74l\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") " pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.454505 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-catalog-content\") pod \"redhat-marketplace-dh74l\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") " pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.458589 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/17a1338b-6385-4795-9397-74316d6599d9-metrics-certs\") pod \"network-metrics-daemon-rpnnm\" (UID: \"17a1338b-6385-4795-9397-74316d6599d9\") " pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.489708 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjsf5\" (UniqueName: \"kubernetes.io/projected/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-kube-api-access-jjsf5\") pod \"redhat-marketplace-dh74l\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") " pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.520500 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.520542 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:13 crc 
kubenswrapper[4698]: I0224 10:20:13.531751 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.532637 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.532656 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.536643 4698 patch_prober.go:28] interesting pod/console-f9d7485db-clwmh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.536670 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-clwmh" podUID="348d0b48-f2a9-4326-b8c8-88f43029f382" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.586982 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.587028 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.593977 4698 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9jbm"] Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.598019 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.598054 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.630159 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 24 10:20:13 crc kubenswrapper[4698]: W0224 10:20:13.633994 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25f4eaf1_6171_44dd_b225_be712a45ba1b.slice/crio-741cf2384aadeabb0a41bac00c46f80c21327dca34056e3677aaefbac4200336 WatchSource:0}: Error finding container 741cf2384aadeabb0a41bac00c46f80c21327dca34056e3677aaefbac4200336: Status 404 returned error can't find the container with id 741cf2384aadeabb0a41bac00c46f80c21327dca34056e3677aaefbac4200336 Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.642506 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.656518 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rpnnm" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.696153 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.729001 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"155e4622-3a8e-433c-a997-147c918dc08c","Type":"ContainerStarted","Data":"84f73020f3442863483e8e17893fa4291f1adf5d816c039cb5f1dd334e92908c"} Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.744790 4698 generic.go:334] "Generic (PLEG): container finished" podID="80fe6957-b898-4120-8959-a0e840e8c4f5" containerID="42955b298af7b6a059fba2e728523305574efbfe2d82a1a0bcb79d49b69fe294" exitCode=0 Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.744859 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"80fe6957-b898-4120-8959-a0e840e8c4f5","Type":"ContainerDied","Data":"42955b298af7b6a059fba2e728523305574efbfe2d82a1a0bcb79d49b69fe294"} Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.760084 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9jbm" event={"ID":"25f4eaf1-6171-44dd-b225-be712a45ba1b","Type":"ContainerStarted","Data":"741cf2384aadeabb0a41bac00c46f80c21327dca34056e3677aaefbac4200336"} Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.774043 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lqzfp" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.840100 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ppmk4"] Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.841043 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ppmk4" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.850219 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.871796 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppmk4"] Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.909367 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-trk7h" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.917516 4698 patch_prober.go:28] interesting pod/router-default-5444994796-trk7h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:20:13 crc kubenswrapper[4698]: [-]has-synced failed: reason withheld Feb 24 10:20:13 crc kubenswrapper[4698]: [+]process-running ok Feb 24 10:20:13 crc kubenswrapper[4698]: healthz check failed Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.917569 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-trk7h" podUID="b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.961134 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfcd4\" (UniqueName: \"kubernetes.io/projected/55418747-8c79-496a-9b89-68f9eaa3f01a-kube-api-access-cfcd4\") pod \"redhat-operators-ppmk4\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") " pod="openshift-marketplace/redhat-operators-ppmk4" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.961211 4698 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-utilities\") pod \"redhat-operators-ppmk4\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") " pod="openshift-marketplace/redhat-operators-ppmk4" Feb 24 10:20:13 crc kubenswrapper[4698]: I0224 10:20:13.961229 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-catalog-content\") pod \"redhat-operators-ppmk4\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") " pod="openshift-marketplace/redhat-operators-ppmk4" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.064470 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfcd4\" (UniqueName: \"kubernetes.io/projected/55418747-8c79-496a-9b89-68f9eaa3f01a-kube-api-access-cfcd4\") pod \"redhat-operators-ppmk4\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") " pod="openshift-marketplace/redhat-operators-ppmk4" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.064630 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-utilities\") pod \"redhat-operators-ppmk4\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") " pod="openshift-marketplace/redhat-operators-ppmk4" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.064648 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-catalog-content\") pod \"redhat-operators-ppmk4\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") " pod="openshift-marketplace/redhat-operators-ppmk4" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.065376 4698 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-catalog-content\") pod \"redhat-operators-ppmk4\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") " pod="openshift-marketplace/redhat-operators-ppmk4" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.069910 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-utilities\") pod \"redhat-operators-ppmk4\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") " pod="openshift-marketplace/redhat-operators-ppmk4" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.105208 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfcd4\" (UniqueName: \"kubernetes.io/projected/55418747-8c79-496a-9b89-68f9eaa3f01a-kube-api-access-cfcd4\") pod \"redhat-operators-ppmk4\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") " pod="openshift-marketplace/redhat-operators-ppmk4" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.170177 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppmk4" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.237015 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l7zvp"] Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.237974 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.263300 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7zvp"] Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.270007 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-utilities\") pod \"redhat-operators-l7zvp\" (UID: \"291ba94f-a9ac-4d5c-8476-221496078d80\") " pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.270068 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-catalog-content\") pod \"redhat-operators-l7zvp\" (UID: \"291ba94f-a9ac-4d5c-8476-221496078d80\") " pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.270087 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn7xk\" (UniqueName: \"kubernetes.io/projected/291ba94f-a9ac-4d5c-8476-221496078d80-kube-api-access-xn7xk\") pod \"redhat-operators-l7zvp\" (UID: \"291ba94f-a9ac-4d5c-8476-221496078d80\") " pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.333906 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rpnnm"] Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.360961 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.368920 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dh74l"] Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.374881 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdt2j\" (UniqueName: \"kubernetes.io/projected/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-kube-api-access-wdt2j\") pod \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.374940 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-secret-volume\") pod \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.375015 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-config-volume\") pod \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\" (UID: \"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9\") " Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.375176 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-utilities\") pod \"redhat-operators-l7zvp\" (UID: \"291ba94f-a9ac-4d5c-8476-221496078d80\") " pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.375228 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-catalog-content\") pod 
\"redhat-operators-l7zvp\" (UID: \"291ba94f-a9ac-4d5c-8476-221496078d80\") " pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.375244 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn7xk\" (UniqueName: \"kubernetes.io/projected/291ba94f-a9ac-4d5c-8476-221496078d80-kube-api-access-xn7xk\") pod \"redhat-operators-l7zvp\" (UID: \"291ba94f-a9ac-4d5c-8476-221496078d80\") " pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.378331 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-utilities\") pod \"redhat-operators-l7zvp\" (UID: \"291ba94f-a9ac-4d5c-8476-221496078d80\") " pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.386668 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "97c3a4a8-9e33-4012-9b16-5a0de6e0ace9" (UID: "97c3a4a8-9e33-4012-9b16-5a0de6e0ace9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.386833 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-catalog-content\") pod \"redhat-operators-l7zvp\" (UID: \"291ba94f-a9ac-4d5c-8476-221496078d80\") " pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.387016 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-kube-api-access-wdt2j" (OuterVolumeSpecName: "kube-api-access-wdt2j") pod "97c3a4a8-9e33-4012-9b16-5a0de6e0ace9" (UID: "97c3a4a8-9e33-4012-9b16-5a0de6e0ace9"). InnerVolumeSpecName "kube-api-access-wdt2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.387247 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-config-volume" (OuterVolumeSpecName: "config-volume") pod "97c3a4a8-9e33-4012-9b16-5a0de6e0ace9" (UID: "97c3a4a8-9e33-4012-9b16-5a0de6e0ace9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:20:14 crc kubenswrapper[4698]: W0224 10:20:14.394433 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f1af873_5e8f_4f75_81c2_c9b26ee37f2a.slice/crio-dd26b6fb73cd2b0977541851f09e591a0eef46bc9736ba4dae9a64dafab09efa WatchSource:0}: Error finding container dd26b6fb73cd2b0977541851f09e591a0eef46bc9736ba4dae9a64dafab09efa: Status 404 returned error can't find the container with id dd26b6fb73cd2b0977541851f09e591a0eef46bc9736ba4dae9a64dafab09efa Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.399148 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn7xk\" (UniqueName: \"kubernetes.io/projected/291ba94f-a9ac-4d5c-8476-221496078d80-kube-api-access-xn7xk\") pod \"redhat-operators-l7zvp\" (UID: \"291ba94f-a9ac-4d5c-8476-221496078d80\") " pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.477167 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdt2j\" (UniqueName: \"kubernetes.io/projected/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-kube-api-access-wdt2j\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.477214 4698 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.477225 4698 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97c3a4a8-9e33-4012-9b16-5a0de6e0ace9-config-volume\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.598433 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.770354 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh74l" event={"ID":"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a","Type":"ContainerStarted","Data":"dd26b6fb73cd2b0977541851f09e591a0eef46bc9736ba4dae9a64dafab09efa"} Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.783191 4698 generic.go:334] "Generic (PLEG): container finished" podID="25f4eaf1-6171-44dd-b225-be712a45ba1b" containerID="c8fbd1359619ee5da259f2643ad462ed01f041a307bbb27fcbe122d7565a7094" exitCode=0 Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.783288 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9jbm" event={"ID":"25f4eaf1-6171-44dd-b225-be712a45ba1b","Type":"ContainerDied","Data":"c8fbd1359619ee5da259f2643ad462ed01f041a307bbb27fcbe122d7565a7094"} Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.791744 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" event={"ID":"17a1338b-6385-4795-9397-74316d6599d9","Type":"ContainerStarted","Data":"c2f3f9ecfdd8e6da9dd0f0432af3e639504f95a1f242b34893c716798faf6782"} Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.796780 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"155e4622-3a8e-433c-a997-147c918dc08c","Type":"ContainerStarted","Data":"48f0e5a436d222eba3652f05d969e4446d631ab169947050fb1559bd7327783b"} Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.814786 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.815345 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532135-2wtjd" event={"ID":"97c3a4a8-9e33-4012-9b16-5a0de6e0ace9","Type":"ContainerDied","Data":"569095d4310b2fb137e5e2990855206a091ce46a80655a67c8377af3042a0199"} Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.815399 4698 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="569095d4310b2fb137e5e2990855206a091ce46a80655a67c8377af3042a0199" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.831868 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.831849461 podStartE2EDuration="2.831849461s" podCreationTimestamp="2026-02-24 10:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:14.827889139 +0000 UTC m=+239.941503380" watchObservedRunningTime="2026-02-24 10:20:14.831849461 +0000 UTC m=+239.945463702" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.916014 4698 patch_prober.go:28] interesting pod/router-default-5444994796-trk7h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:20:14 crc kubenswrapper[4698]: [-]has-synced failed: reason withheld Feb 24 10:20:14 crc kubenswrapper[4698]: [+]process-running ok Feb 24 10:20:14 crc kubenswrapper[4698]: healthz check failed Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.916258 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-trk7h" podUID="b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658" 
containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:20:14 crc kubenswrapper[4698]: I0224 10:20:14.945687 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ppmk4"] Feb 24 10:20:15 crc kubenswrapper[4698]: W0224 10:20:15.019196 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55418747_8c79_496a_9b89_68f9eaa3f01a.slice/crio-19df3b359278e17ef43c843a429e1d6694c29b28feddc379809fc373338c52b2 WatchSource:0}: Error finding container 19df3b359278e17ef43c843a429e1d6694c29b28feddc379809fc373338c52b2: Status 404 returned error can't find the container with id 19df3b359278e17ef43c843a429e1d6694c29b28feddc379809fc373338c52b2 Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.210696 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l7zvp"] Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.221627 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.295644 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80fe6957-b898-4120-8959-a0e840e8c4f5-kubelet-dir\") pod \"80fe6957-b898-4120-8959-a0e840e8c4f5\" (UID: \"80fe6957-b898-4120-8959-a0e840e8c4f5\") " Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.295807 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80fe6957-b898-4120-8959-a0e840e8c4f5-kube-api-access\") pod \"80fe6957-b898-4120-8959-a0e840e8c4f5\" (UID: \"80fe6957-b898-4120-8959-a0e840e8c4f5\") " Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.296324 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/80fe6957-b898-4120-8959-a0e840e8c4f5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "80fe6957-b898-4120-8959-a0e840e8c4f5" (UID: "80fe6957-b898-4120-8959-a0e840e8c4f5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.302126 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80fe6957-b898-4120-8959-a0e840e8c4f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "80fe6957-b898-4120-8959-a0e840e8c4f5" (UID: "80fe6957-b898-4120-8959-a0e840e8c4f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.397565 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/80fe6957-b898-4120-8959-a0e840e8c4f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.397605 4698 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/80fe6957-b898-4120-8959-a0e840e8c4f5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.819931 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh74l" event={"ID":"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a","Type":"ContainerStarted","Data":"3eac69b7cc348cccf17e267bf0b0792df72bd7ce9641816eebe4c22c26e5ce6d"} Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.821205 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7zvp" event={"ID":"291ba94f-a9ac-4d5c-8476-221496078d80","Type":"ContainerStarted","Data":"d45dbe795bfedd1e2eeb45e669c0614087eac6502f1b8e64c74ed3c71c9ed10a"} Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.822705 4698 generic.go:334] "Generic (PLEG): container finished" podID="155e4622-3a8e-433c-a997-147c918dc08c" containerID="48f0e5a436d222eba3652f05d969e4446d631ab169947050fb1559bd7327783b" exitCode=0 Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.822799 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"155e4622-3a8e-433c-a997-147c918dc08c","Type":"ContainerDied","Data":"48f0e5a436d222eba3652f05d969e4446d631ab169947050fb1559bd7327783b"} Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.826306 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppmk4" 
event={"ID":"55418747-8c79-496a-9b89-68f9eaa3f01a","Type":"ContainerStarted","Data":"19df3b359278e17ef43c843a429e1d6694c29b28feddc379809fc373338c52b2"} Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.828060 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"80fe6957-b898-4120-8959-a0e840e8c4f5","Type":"ContainerDied","Data":"8ef10187319d9f6ea9dde4eef8606a0900ef477e3636d80531f8566a78cf64d6"} Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.828167 4698 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ef10187319d9f6ea9dde4eef8606a0900ef477e3636d80531f8566a78cf64d6" Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.828319 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.912325 4698 patch_prober.go:28] interesting pod/router-default-5444994796-trk7h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:20:15 crc kubenswrapper[4698]: [-]has-synced failed: reason withheld Feb 24 10:20:15 crc kubenswrapper[4698]: [+]process-running ok Feb 24 10:20:15 crc kubenswrapper[4698]: healthz check failed Feb 24 10:20:15 crc kubenswrapper[4698]: I0224 10:20:15.912389 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-trk7h" podUID="b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:20:16 crc kubenswrapper[4698]: I0224 10:20:16.536338 4698 ???:1] "http: TLS handshake error from 192.168.126.11:60332: no serving certificate available for the kubelet" Feb 24 10:20:16 crc kubenswrapper[4698]: I0224 10:20:16.837736 4698 generic.go:334] "Generic (PLEG): 
container finished" podID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerID="3eac69b7cc348cccf17e267bf0b0792df72bd7ce9641816eebe4c22c26e5ce6d" exitCode=0 Feb 24 10:20:16 crc kubenswrapper[4698]: I0224 10:20:16.837812 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh74l" event={"ID":"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a","Type":"ContainerDied","Data":"3eac69b7cc348cccf17e267bf0b0792df72bd7ce9641816eebe4c22c26e5ce6d"} Feb 24 10:20:16 crc kubenswrapper[4698]: I0224 10:20:16.840346 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" event={"ID":"17a1338b-6385-4795-9397-74316d6599d9","Type":"ContainerStarted","Data":"2d88fdba57e7acedfeb4571166d511fe7501843beea38c1c63aaf9f105e315bd"} Feb 24 10:20:16 crc kubenswrapper[4698]: I0224 10:20:16.845381 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppmk4" event={"ID":"55418747-8c79-496a-9b89-68f9eaa3f01a","Type":"ContainerStarted","Data":"9d94e6b7b514c817e238b3de6823ddc2258e3d730b22674ceb4addc8ea7fa387"} Feb 24 10:20:16 crc kubenswrapper[4698]: I0224 10:20:16.910535 4698 patch_prober.go:28] interesting pod/router-default-5444994796-trk7h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 24 10:20:16 crc kubenswrapper[4698]: [+]has-synced ok Feb 24 10:20:16 crc kubenswrapper[4698]: [+]process-running ok Feb 24 10:20:16 crc kubenswrapper[4698]: healthz check failed Feb 24 10:20:16 crc kubenswrapper[4698]: I0224 10:20:16.911371 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-trk7h" podUID="b44b8c72-3ca2-4fbe-aa3f-9ab7917b1658" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.162764 4698 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.221579 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/155e4622-3a8e-433c-a997-147c918dc08c-kube-api-access\") pod \"155e4622-3a8e-433c-a997-147c918dc08c\" (UID: \"155e4622-3a8e-433c-a997-147c918dc08c\") " Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.221700 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/155e4622-3a8e-433c-a997-147c918dc08c-kubelet-dir\") pod \"155e4622-3a8e-433c-a997-147c918dc08c\" (UID: \"155e4622-3a8e-433c-a997-147c918dc08c\") " Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.221919 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/155e4622-3a8e-433c-a997-147c918dc08c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "155e4622-3a8e-433c-a997-147c918dc08c" (UID: "155e4622-3a8e-433c-a997-147c918dc08c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.227337 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/155e4622-3a8e-433c-a997-147c918dc08c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "155e4622-3a8e-433c-a997-147c918dc08c" (UID: "155e4622-3a8e-433c-a997-147c918dc08c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.322688 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/155e4622-3a8e-433c-a997-147c918dc08c-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.322714 4698 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/155e4622-3a8e-433c-a997-147c918dc08c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.861475 4698 generic.go:334] "Generic (PLEG): container finished" podID="291ba94f-a9ac-4d5c-8476-221496078d80" containerID="6a2b14809b5503e2af70b5402fd1ed78253f7bc05b307b8c2d9edae83978c36a" exitCode=0 Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.861548 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7zvp" event={"ID":"291ba94f-a9ac-4d5c-8476-221496078d80","Type":"ContainerDied","Data":"6a2b14809b5503e2af70b5402fd1ed78253f7bc05b307b8c2d9edae83978c36a"} Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.864083 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"155e4622-3a8e-433c-a997-147c918dc08c","Type":"ContainerDied","Data":"84f73020f3442863483e8e17893fa4291f1adf5d816c039cb5f1dd334e92908c"} Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.864106 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.864112 4698 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84f73020f3442863483e8e17893fa4291f1adf5d816c039cb5f1dd334e92908c" Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.865595 4698 generic.go:334] "Generic (PLEG): container finished" podID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerID="9d94e6b7b514c817e238b3de6823ddc2258e3d730b22674ceb4addc8ea7fa387" exitCode=0 Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.865664 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppmk4" event={"ID":"55418747-8c79-496a-9b89-68f9eaa3f01a","Type":"ContainerDied","Data":"9d94e6b7b514c817e238b3de6823ddc2258e3d730b22674ceb4addc8ea7fa387"} Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.911975 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-trk7h" Feb 24 10:20:17 crc kubenswrapper[4698]: I0224 10:20:17.920095 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-trk7h" Feb 24 10:20:18 crc kubenswrapper[4698]: I0224 10:20:18.791043 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6h5bj" Feb 24 10:20:19 crc kubenswrapper[4698]: I0224 10:20:19.880815 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rpnnm" event={"ID":"17a1338b-6385-4795-9397-74316d6599d9","Type":"ContainerStarted","Data":"561a11d8fb20e98f19897c6aa5e3d0242732a03858de8a48cc4d8fcbccf3cd02"} Feb 24 10:20:19 crc kubenswrapper[4698]: I0224 10:20:19.892963 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rpnnm" podStartSLOduration=201.892923847 
podStartE2EDuration="3m21.892923847s" podCreationTimestamp="2026-02-24 10:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:20:19.89223018 +0000 UTC m=+245.005844411" watchObservedRunningTime="2026-02-24 10:20:19.892923847 +0000 UTC m=+245.006538088" Feb 24 10:20:19 crc kubenswrapper[4698]: I0224 10:20:19.991462 4698 ???:1] "http: TLS handshake error from 192.168.126.11:60338: no serving certificate available for the kubelet" Feb 24 10:20:22 crc kubenswrapper[4698]: I0224 10:20:22.197178 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:20:22 crc kubenswrapper[4698]: I0224 10:20:22.197235 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:20:23 crc kubenswrapper[4698]: I0224 10:20:23.533142 4698 patch_prober.go:28] interesting pod/console-f9d7485db-clwmh container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Feb 24 10:20:23 crc kubenswrapper[4698]: I0224 10:20:23.533455 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-clwmh" podUID="348d0b48-f2a9-4326-b8c8-88f43029f382" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Feb 24 10:20:23 crc 
kubenswrapper[4698]: I0224 10:20:23.586564 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:20:23 crc kubenswrapper[4698]: I0224 10:20:23.586597 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:20:23 crc kubenswrapper[4698]: I0224 10:20:23.586650 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:20:23 crc kubenswrapper[4698]: I0224 10:20:23.586658 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:20:26 crc kubenswrapper[4698]: I0224 10:20:26.807079 4698 ???:1] "http: TLS handshake error from 192.168.126.11:34274: no serving certificate available for the kubelet" Feb 24 10:20:29 crc kubenswrapper[4698]: I0224 10:20:29.456336 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d856f7db-zpnrp"] Feb 24 10:20:29 crc kubenswrapper[4698]: I0224 10:20:29.456673 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp" podUID="68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" 
containerName="controller-manager" containerID="cri-o://8508a32fb34c581a4e249c2347b593ee7a48baac6c6f739d0875a1bf90267655" gracePeriod=30 Feb 24 10:20:29 crc kubenswrapper[4698]: I0224 10:20:29.480854 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"] Feb 24 10:20:29 crc kubenswrapper[4698]: I0224 10:20:29.481219 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc" podUID="1925aa4a-90ca-4879-8030-1809fec30eb7" containerName="route-controller-manager" containerID="cri-o://0cb0c35e7654e88f01475f7d4e84f8f797f5e57119b632e704072c3b4092efd4" gracePeriod=30 Feb 24 10:20:29 crc kubenswrapper[4698]: I0224 10:20:29.946028 4698 generic.go:334] "Generic (PLEG): container finished" podID="1925aa4a-90ca-4879-8030-1809fec30eb7" containerID="0cb0c35e7654e88f01475f7d4e84f8f797f5e57119b632e704072c3b4092efd4" exitCode=0 Feb 24 10:20:29 crc kubenswrapper[4698]: I0224 10:20:29.946154 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc" event={"ID":"1925aa4a-90ca-4879-8030-1809fec30eb7","Type":"ContainerDied","Data":"0cb0c35e7654e88f01475f7d4e84f8f797f5e57119b632e704072c3b4092efd4"} Feb 24 10:20:29 crc kubenswrapper[4698]: I0224 10:20:29.947827 4698 generic.go:334] "Generic (PLEG): container finished" podID="68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" containerID="8508a32fb34c581a4e249c2347b593ee7a48baac6c6f739d0875a1bf90267655" exitCode=0 Feb 24 10:20:29 crc kubenswrapper[4698]: I0224 10:20:29.947862 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp" event={"ID":"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83","Type":"ContainerDied","Data":"8508a32fb34c581a4e249c2347b593ee7a48baac6c6f739d0875a1bf90267655"} Feb 24 10:20:31 crc kubenswrapper[4698]: I0224 
10:20:31.985936 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" Feb 24 10:20:32 crc kubenswrapper[4698]: I0224 10:20:32.142725 4698 patch_prober.go:28] interesting pod/route-controller-manager-5b56f5b4c8-c4lgc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Feb 24 10:20:32 crc kubenswrapper[4698]: I0224 10:20:32.142797 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc" podUID="1925aa4a-90ca-4879-8030-1809fec30eb7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Feb 24 10:20:32 crc kubenswrapper[4698]: I0224 10:20:32.164650 4698 patch_prober.go:28] interesting pod/controller-manager-d856f7db-zpnrp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Feb 24 10:20:32 crc kubenswrapper[4698]: I0224 10:20:32.164724 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp" podUID="68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Feb 24 10:20:33 crc kubenswrapper[4698]: I0224 10:20:33.546755 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:33 crc kubenswrapper[4698]: I0224 10:20:33.552115 4698 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-clwmh" Feb 24 10:20:33 crc kubenswrapper[4698]: I0224 10:20:33.588129 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:20:33 crc kubenswrapper[4698]: I0224 10:20:33.588208 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:20:33 crc kubenswrapper[4698]: I0224 10:20:33.588284 4698 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-z42jf" Feb 24 10:20:33 crc kubenswrapper[4698]: I0224 10:20:33.588357 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:20:33 crc kubenswrapper[4698]: I0224 10:20:33.588413 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:20:33 crc kubenswrapper[4698]: I0224 10:20:33.588666 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= 
Feb 24 10:20:33 crc kubenswrapper[4698]: I0224 10:20:33.588694 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:20:33 crc kubenswrapper[4698]: I0224 10:20:33.588907 4698 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"34254fe153c37c2de36dbd7cc42747f42149dacd16b6044db1672740dbeaf1a9"} pod="openshift-console/downloads-7954f5f757-z42jf" containerMessage="Container download-server failed liveness probe, will be restarted" Feb 24 10:20:33 crc kubenswrapper[4698]: I0224 10:20:33.588935 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" containerID="cri-o://34254fe153c37c2de36dbd7cc42747f42149dacd16b6044db1672740dbeaf1a9" gracePeriod=2 Feb 24 10:20:34 crc kubenswrapper[4698]: I0224 10:20:34.972815 4698 generic.go:334] "Generic (PLEG): container finished" podID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerID="34254fe153c37c2de36dbd7cc42747f42149dacd16b6044db1672740dbeaf1a9" exitCode=0 Feb 24 10:20:34 crc kubenswrapper[4698]: I0224 10:20:34.972861 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z42jf" event={"ID":"108d72f5-0dd9-4965-a41f-7403ad8fce04","Type":"ContainerDied","Data":"34254fe153c37c2de36dbd7cc42747f42149dacd16b6044db1672740dbeaf1a9"} Feb 24 10:20:39 crc kubenswrapper[4698]: I0224 10:20:39.741155 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 24 10:20:43 crc kubenswrapper[4698]: I0224 10:20:43.143017 4698 
patch_prober.go:28] interesting pod/route-controller-manager-5b56f5b4c8-c4lgc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:20:43 crc kubenswrapper[4698]: I0224 10:20:43.143426 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc" podUID="1925aa4a-90ca-4879-8030-1809fec30eb7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 10:20:43 crc kubenswrapper[4698]: I0224 10:20:43.165191 4698 patch_prober.go:28] interesting pod/controller-manager-d856f7db-zpnrp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:20:43 crc kubenswrapper[4698]: I0224 10:20:43.165249 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp" podUID="68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 10:20:43 crc kubenswrapper[4698]: I0224 10:20:43.464706 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rkxpp" Feb 24 10:20:43 crc kubenswrapper[4698]: I0224 10:20:43.586926 4698 patch_prober.go:28] 
interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:20:43 crc kubenswrapper[4698]: I0224 10:20:43.587004 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:20:44 crc kubenswrapper[4698]: I0224 10:20:44.774050 4698 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-79f62 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:20:44 crc kubenswrapper[4698]: I0224 10:20:44.774129 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" podUID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 24 10:20:44 crc kubenswrapper[4698]: I0224 10:20:44.774497 4698 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-79f62 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:20:44 crc kubenswrapper[4698]: I0224 10:20:44.774528 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" 
podUID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.046804 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 24 10:20:47 crc kubenswrapper[4698]: E0224 10:20:47.047368 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fe6957-b898-4120-8959-a0e840e8c4f5" containerName="pruner" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.047403 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fe6957-b898-4120-8959-a0e840e8c4f5" containerName="pruner" Feb 24 10:20:47 crc kubenswrapper[4698]: E0224 10:20:47.047430 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="155e4622-3a8e-433c-a997-147c918dc08c" containerName="pruner" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.047446 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="155e4622-3a8e-433c-a997-147c918dc08c" containerName="pruner" Feb 24 10:20:47 crc kubenswrapper[4698]: E0224 10:20:47.047469 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97c3a4a8-9e33-4012-9b16-5a0de6e0ace9" containerName="collect-profiles" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.047487 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c3a4a8-9e33-4012-9b16-5a0de6e0ace9" containerName="collect-profiles" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.047722 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="97c3a4a8-9e33-4012-9b16-5a0de6e0ace9" containerName="collect-profiles" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.047753 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="155e4622-3a8e-433c-a997-147c918dc08c" containerName="pruner" Feb 24 10:20:47 crc kubenswrapper[4698]: 
I0224 10:20:47.047774 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="80fe6957-b898-4120-8959-a0e840e8c4f5" containerName="pruner" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.048747 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.055870 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.056172 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.070328 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.156443 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4928b0cb-1f5d-4242-96e1-014e01e5d2d4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.156495 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4928b0cb-1f5d-4242-96e1-014e01e5d2d4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.257652 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"4928b0cb-1f5d-4242-96e1-014e01e5d2d4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.257722 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4928b0cb-1f5d-4242-96e1-014e01e5d2d4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.257943 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4928b0cb-1f5d-4242-96e1-014e01e5d2d4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.293578 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4928b0cb-1f5d-4242-96e1-014e01e5d2d4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:20:47 crc kubenswrapper[4698]: I0224 10:20:47.423761 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:20:51 crc kubenswrapper[4698]: E0224 10:20:51.139049 4698 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 24 10:20:51 crc kubenswrapper[4698]: E0224 10:20:51.139423 4698 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mq2ws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-p9jbm_openshift-marketplace(25f4eaf1-6171-44dd-b225-be712a45ba1b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 10:20:51 crc kubenswrapper[4698]: E0224 10:20:51.140784 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-p9jbm" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" Feb 24 10:20:51 crc kubenswrapper[4698]: E0224 10:20:51.189190 4698 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 24 10:20:51 crc kubenswrapper[4698]: E0224 10:20:51.189382 4698 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b27gl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-p5dwb_openshift-marketplace(19022af1-394c-4aab-9eb1-ffb0f566d0ac): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 10:20:51 crc kubenswrapper[4698]: E0224 10:20:51.190757 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-p5dwb" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" Feb 24 10:20:52 crc 
kubenswrapper[4698]: I0224 10:20:52.196984 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.198193 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.198309 4698 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.199525 4698 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e"} pod="openshift-machine-config-operator/machine-config-daemon-nn578" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.200296 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" containerID="cri-o://a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e" gracePeriod=600 Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.449601 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.451161 4698 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.478405 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.641404 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-var-lock\") pod \"installer-9-crc\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.641527 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.642086 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kube-api-access\") pod \"installer-9-crc\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.743130 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.743296 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kube-api-access\") pod \"installer-9-crc\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.743382 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-var-lock\") pod \"installer-9-crc\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.743430 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.743504 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-var-lock\") pod \"installer-9-crc\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.776626 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kube-api-access\") pod \"installer-9-crc\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:20:52 crc kubenswrapper[4698]: I0224 10:20:52.784468 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:20:53 crc kubenswrapper[4698]: I0224 10:20:53.142708 4698 patch_prober.go:28] interesting pod/route-controller-manager-5b56f5b4c8-c4lgc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:20:53 crc kubenswrapper[4698]: I0224 10:20:53.143520 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc" podUID="1925aa4a-90ca-4879-8030-1809fec30eb7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 10:20:53 crc kubenswrapper[4698]: I0224 10:20:53.166020 4698 patch_prober.go:28] interesting pod/controller-manager-d856f7db-zpnrp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:20:53 crc kubenswrapper[4698]: I0224 10:20:53.166625 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp" podUID="68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 10:20:53 crc kubenswrapper[4698]: I0224 10:20:53.587000 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:20:57 crc kubenswrapper[4698]: I0224 10:20:53.587093 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:20:57 crc kubenswrapper[4698]: I0224 10:20:54.917471 4698 generic.go:334] "Generic (PLEG): container finished" podID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerID="a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e" exitCode=0 Feb 24 10:20:57 crc kubenswrapper[4698]: I0224 10:20:54.917536 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerDied","Data":"a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e"} Feb 24 10:20:57 crc kubenswrapper[4698]: E0224 10:20:57.164954 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-p9jbm" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" Feb 24 10:20:57 crc kubenswrapper[4698]: E0224 10:20:57.164979 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-p5dwb" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" Feb 24 10:20:58 crc kubenswrapper[4698]: E0224 10:20:58.694197 4698 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Feb 24 10:20:58 crc kubenswrapper[4698]: E0224 10:20:58.694676 4698 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w26kv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2z8bk_openshift-marketplace(5149fd4f-19d7-4852-b09a-d9909b8231dd): 
ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 10:20:58 crc kubenswrapper[4698]: E0224 10:20:58.695893 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2z8bk" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" Feb 24 10:20:58 crc kubenswrapper[4698]: E0224 10:20:58.712189 4698 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 24 10:20:58 crc kubenswrapper[4698]: E0224 10:20:58.712546 4698 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jjsf5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-dh74l_openshift-marketplace(6f1af873-5e8f-4f75-81c2-c9b26ee37f2a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 10:20:58 crc kubenswrapper[4698]: E0224 10:20:58.714028 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dh74l" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" Feb 24 10:20:58 crc 
kubenswrapper[4698]: I0224 10:20:58.756142 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.764296 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.792364 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5795d455f6-j5sh9"]
Feb 24 10:20:58 crc kubenswrapper[4698]: E0224 10:20:58.792602 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" containerName="controller-manager"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.792616 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" containerName="controller-manager"
Feb 24 10:20:58 crc kubenswrapper[4698]: E0224 10:20:58.792630 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1925aa4a-90ca-4879-8030-1809fec30eb7" containerName="route-controller-manager"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.792639 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="1925aa4a-90ca-4879-8030-1809fec30eb7" containerName="route-controller-manager"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.792763 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" containerName="controller-manager"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.792780 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="1925aa4a-90ca-4879-8030-1809fec30eb7" containerName="route-controller-manager"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.793173 4698 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: E0224 10:20:58.799079 4698 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Feb 24 10:20:58 crc kubenswrapper[4698]: E0224 10:20:58.799239 4698 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zz7zw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil
,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8hhkf_openshift-marketplace(2eee2a16-171b-402e-9549-3d14cb56cddc): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.800641 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5795d455f6-j5sh9"]
Feb 24 10:20:58 crc kubenswrapper[4698]: E0224 10:20:58.800667 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8hhkf" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837322 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-config\") pod \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") "
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837406 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-serving-cert\") pod \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") "
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837440 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-proxy-ca-bundles\") pod \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") "
Feb 24 10:20:58 crc kubenswrapper[4698]:
I0224 10:20:58.837464 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1925aa4a-90ca-4879-8030-1809fec30eb7-serving-cert\") pod \"1925aa4a-90ca-4879-8030-1809fec30eb7\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") "
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837493 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-client-ca\") pod \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") "
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837515 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qqbnn\" (UniqueName: \"kubernetes.io/projected/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-kube-api-access-qqbnn\") pod \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\" (UID: \"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83\") "
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837548 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-config\") pod \"1925aa4a-90ca-4879-8030-1809fec30eb7\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") "
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837571 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-client-ca\") pod \"1925aa4a-90ca-4879-8030-1809fec30eb7\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") "
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837603 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwjzt\" (UniqueName: \"kubernetes.io/projected/1925aa4a-90ca-4879-8030-1809fec30eb7-kube-api-access-hwjzt\") pod
\"1925aa4a-90ca-4879-8030-1809fec30eb7\" (UID: \"1925aa4a-90ca-4879-8030-1809fec30eb7\") "
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837761 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-client-ca\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837808 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-proxy-ca-bundles\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837839 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7jd2\" (UniqueName: \"kubernetes.io/projected/eeb6c0fe-63cc-404a-a628-885660e52dc9-kube-api-access-w7jd2\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837863 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-config\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.837893 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeb6c0fe-63cc-404a-a628-885660e52dc9-serving-cert\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.838205 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-client-ca" (OuterVolumeSpecName: "client-ca") pod "68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" (UID: "68542d0e-8ec2-4509-b5b2-2ec3d9a0de83"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.838303 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" (UID: "68542d0e-8ec2-4509-b5b2-2ec3d9a0de83"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.838861 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-client-ca" (OuterVolumeSpecName: "client-ca") pod "1925aa4a-90ca-4879-8030-1809fec30eb7" (UID: "1925aa4a-90ca-4879-8030-1809fec30eb7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.838979 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-config" (OuterVolumeSpecName: "config") pod "68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" (UID: "68542d0e-8ec2-4509-b5b2-2ec3d9a0de83"). InnerVolumeSpecName "config".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.838992 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-config" (OuterVolumeSpecName: "config") pod "1925aa4a-90ca-4879-8030-1809fec30eb7" (UID: "1925aa4a-90ca-4879-8030-1809fec30eb7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.845485 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1925aa4a-90ca-4879-8030-1809fec30eb7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1925aa4a-90ca-4879-8030-1809fec30eb7" (UID: "1925aa4a-90ca-4879-8030-1809fec30eb7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.845505 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" (UID: "68542d0e-8ec2-4509-b5b2-2ec3d9a0de83"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.845604 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1925aa4a-90ca-4879-8030-1809fec30eb7-kube-api-access-hwjzt" (OuterVolumeSpecName: "kube-api-access-hwjzt") pod "1925aa4a-90ca-4879-8030-1809fec30eb7" (UID: "1925aa4a-90ca-4879-8030-1809fec30eb7"). InnerVolumeSpecName "kube-api-access-hwjzt".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.858344 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-kube-api-access-qqbnn" (OuterVolumeSpecName: "kube-api-access-qqbnn") pod "68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" (UID: "68542d0e-8ec2-4509-b5b2-2ec3d9a0de83"). InnerVolumeSpecName "kube-api-access-qqbnn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938281 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-config\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938348 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeb6c0fe-63cc-404a-a628-885660e52dc9-serving-cert\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938377 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-client-ca\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938428 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName:
\"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-proxy-ca-bundles\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938469 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7jd2\" (UniqueName: \"kubernetes.io/projected/eeb6c0fe-63cc-404a-a628-885660e52dc9-kube-api-access-w7jd2\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938525 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938538 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938548 4698 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938561 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1925aa4a-90ca-4879-8030-1809fec30eb7-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938572 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-client-ca\") on node \"crc\" DevicePath \"\""
Feb 24
10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938583 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qqbnn\" (UniqueName: \"kubernetes.io/projected/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83-kube-api-access-qqbnn\") on node \"crc\" DevicePath \"\""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938593 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938604 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1925aa4a-90ca-4879-8030-1809fec30eb7-client-ca\") on node \"crc\" DevicePath \"\""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.938615 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwjzt\" (UniqueName: \"kubernetes.io/projected/1925aa4a-90ca-4879-8030-1809fec30eb7-kube-api-access-hwjzt\") on node \"crc\" DevicePath \"\""
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.939715 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-proxy-ca-bundles\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.939747 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-client-ca\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.940020 4698 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-config\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.944974 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp" event={"ID":"68542d0e-8ec2-4509-b5b2-2ec3d9a0de83","Type":"ContainerDied","Data":"42f2667a706c5bb23756d19d2e336fa403d8d55795f8059d6325bb540e14b49a"}
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.944998 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d856f7db-zpnrp"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.945022 4698 scope.go:117] "RemoveContainer" containerID="8508a32fb34c581a4e249c2347b593ee7a48baac6c6f739d0875a1bf90267655"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.945054 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeb6c0fe-63cc-404a-a628-885660e52dc9-serving-cert\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.950852 4698 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.955736 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc" event={"ID":"1925aa4a-90ca-4879-8030-1809fec30eb7","Type":"ContainerDied","Data":"5babffb38530a77ca75c674ea7727e6b8045c76ad9c33402d1ad8434232809c4"}
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.960746 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7jd2\" (UniqueName: \"kubernetes.io/projected/eeb6c0fe-63cc-404a-a628-885660e52dc9-kube-api-access-w7jd2\") pod \"controller-manager-5795d455f6-j5sh9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:58 crc kubenswrapper[4698]: I0224 10:20:58.997913 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d856f7db-zpnrp"]
Feb 24 10:20:59 crc kubenswrapper[4698]: I0224 10:20:59.004610 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d856f7db-zpnrp"]
Feb 24 10:20:59 crc kubenswrapper[4698]: I0224 10:20:59.015396 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"]
Feb 24 10:20:59 crc kubenswrapper[4698]: I0224 10:20:59.019070 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b56f5b4c8-c4lgc"]
Feb 24 10:20:59 crc kubenswrapper[4698]: I0224 10:20:59.122651 4698 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9"
Feb 24 10:20:59 crc kubenswrapper[4698]: I0224 10:20:59.621790 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1925aa4a-90ca-4879-8030-1809fec30eb7" path="/var/lib/kubelet/pods/1925aa4a-90ca-4879-8030-1809fec30eb7/volumes"
Feb 24 10:20:59 crc kubenswrapper[4698]: I0224 10:20:59.622523 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68542d0e-8ec2-4509-b5b2-2ec3d9a0de83" path="/var/lib/kubelet/pods/68542d0e-8ec2-4509-b5b2-2ec3d9a0de83/volumes"
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.813620 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"]
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.814573 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.816724 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.818402 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.818736 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.819762 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.819966 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 24 10:21:00 crc
kubenswrapper[4698]: I0224 10:21:00.821367 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.823445 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"]
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.968120 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-config\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.968163 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps8rj\" (UniqueName: \"kubernetes.io/projected/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-kube-api-access-ps8rj\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.968194 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-client-ca\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:00 crc kubenswrapper[4698]: I0224 10:21:00.968214 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-serving-cert\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:01 crc kubenswrapper[4698]: I0224 10:21:01.069301 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-config\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:01 crc kubenswrapper[4698]: I0224 10:21:01.069351 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps8rj\" (UniqueName: \"kubernetes.io/projected/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-kube-api-access-ps8rj\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:01 crc kubenswrapper[4698]: I0224 10:21:01.069385 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-client-ca\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:01 crc kubenswrapper[4698]: I0224 10:21:01.069402 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-serving-cert\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24
10:21:01 crc kubenswrapper[4698]: I0224 10:21:01.070323 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-client-ca\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:01 crc kubenswrapper[4698]: I0224 10:21:01.070393 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-config\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:01 crc kubenswrapper[4698]: I0224 10:21:01.074049 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-serving-cert\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:01 crc kubenswrapper[4698]: I0224 10:21:01.098866 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps8rj\" (UniqueName: \"kubernetes.io/projected/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-kube-api-access-ps8rj\") pod \"route-controller-manager-dd4f6d495-p98zt\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:01 crc kubenswrapper[4698]: I0224 10:21:01.139367 4698 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"
Feb 24 10:21:02 crc kubenswrapper[4698]: E0224 10:21:02.144675 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8hhkf" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc"
Feb 24 10:21:02 crc kubenswrapper[4698]: E0224 10:21:02.144901 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2z8bk" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd"
Feb 24 10:21:02 crc kubenswrapper[4698]: E0224 10:21:02.171823 4698 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Feb 24 10:21:02 crc kubenswrapper[4698]: E0224 10:21:02.172045 4698 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cfcd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-ppmk4_openshift-marketplace(55418747-8c79-496a-9b89-68f9eaa3f01a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 10:21:02 crc kubenswrapper[4698]: E0224 10:21:02.173200 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-ppmk4" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" Feb 24 10:21:02 crc 
kubenswrapper[4698]: I0224 10:21:02.178736 4698 scope.go:117] "RemoveContainer" containerID="0cb0c35e7654e88f01475f7d4e84f8f797f5e57119b632e704072c3b4092efd4" Feb 24 10:21:02 crc kubenswrapper[4698]: E0224 10:21:02.193904 4698 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 24 10:21:02 crc kubenswrapper[4698]: E0224 10:21:02.194159 4698 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xn7xk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLo
gsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-l7zvp_openshift-marketplace(291ba94f-a9ac-4d5c-8476-221496078d80): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 24 10:21:02 crc kubenswrapper[4698]: E0224 10:21:02.195442 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-l7zvp" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" Feb 24 10:21:02 crc kubenswrapper[4698]: I0224 10:21:02.537044 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5795d455f6-j5sh9"] Feb 24 10:21:02 crc kubenswrapper[4698]: W0224 10:21:02.550172 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeb6c0fe_63cc_404a_a628_885660e52dc9.slice/crio-16523f52975518510c2e486e4ec8eb3b8282fe9f1a2573e86b0fd7e925f158f9 WatchSource:0}: Error finding container 16523f52975518510c2e486e4ec8eb3b8282fe9f1a2573e86b0fd7e925f158f9: Status 404 returned error can't find the container with id 16523f52975518510c2e486e4ec8eb3b8282fe9f1a2573e86b0fd7e925f158f9 Feb 24 10:21:02 crc kubenswrapper[4698]: I0224 10:21:02.616523 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 24 10:21:02 crc kubenswrapper[4698]: I0224 10:21:02.672511 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"] Feb 24 10:21:02 crc kubenswrapper[4698]: W0224 10:21:02.678424 4698 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa701b5_b0fd_46e8_b2dc_50daf6d2c30c.slice/crio-3326bb2cc9a71a1dfbb7030154ed607cb3e316c8fd2f35577f999b8f5c67505a WatchSource:0}: Error finding container 3326bb2cc9a71a1dfbb7030154ed607cb3e316c8fd2f35577f999b8f5c67505a: Status 404 returned error can't find the container with id 3326bb2cc9a71a1dfbb7030154ed607cb3e316c8fd2f35577f999b8f5c67505a Feb 24 10:21:02 crc kubenswrapper[4698]: I0224 10:21:02.698438 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 24 10:21:02 crc kubenswrapper[4698]: I0224 10:21:02.988427 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-z42jf" event={"ID":"108d72f5-0dd9-4965-a41f-7403ad8fce04","Type":"ContainerStarted","Data":"6c2c7ccab24469b32121c74d1c8c842e13816e7abbf46104f44583f1c2729bca"} Feb 24 10:21:02 crc kubenswrapper[4698]: I0224 10:21:02.988968 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-z42jf" Feb 24 10:21:02 crc kubenswrapper[4698]: I0224 10:21:02.989677 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:21:02 crc kubenswrapper[4698]: I0224 10:21:02.989751 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:21:02 crc kubenswrapper[4698]: I0224 10:21:02.991599 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" 
event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerStarted","Data":"c26c2143394059f82ffa50e03f99ae3948741b5030a14c47db3d70836dce763e"} Feb 24 10:21:02 crc kubenswrapper[4698]: I0224 10:21:02.998177 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4928b0cb-1f5d-4242-96e1-014e01e5d2d4","Type":"ContainerStarted","Data":"1775780dbc8c034221606971e3745aca410a678dc9ad19d84993d536ee5593fe"} Feb 24 10:21:02 crc kubenswrapper[4698]: I0224 10:21:02.998234 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4928b0cb-1f5d-4242-96e1-014e01e5d2d4","Type":"ContainerStarted","Data":"7dae8ebe496c6758cb42e1fe3401a6b44de0a229064af119cd919cbfb2847f95"} Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.013625 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" event={"ID":"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c","Type":"ContainerStarted","Data":"5858cf29ef2ced6489a5acee1e8fd833c91d868928118e8675dd8157df0455d8"} Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.013672 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" event={"ID":"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c","Type":"ContainerStarted","Data":"3326bb2cc9a71a1dfbb7030154ed607cb3e316c8fd2f35577f999b8f5c67505a"} Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.014760 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.016374 4698 patch_prober.go:28] interesting pod/route-controller-manager-dd4f6d495-p98zt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.016420 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" podUID="ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.018805 4698 generic.go:334] "Generic (PLEG): container finished" podID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" containerID="579c0dfd2bc9c72c6249d1bf3426c021e097a658cf8c9e3758c64d729c32acfa" exitCode=0 Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.018857 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59zd2" event={"ID":"d16dadf6-b01e-4bda-b24b-d63801c9bf23","Type":"ContainerDied","Data":"579c0dfd2bc9c72c6249d1bf3426c021e097a658cf8c9e3758c64d729c32acfa"} Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.031582 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9" event={"ID":"eeb6c0fe-63cc-404a-a628-885660e52dc9","Type":"ContainerStarted","Data":"4e2bc83f09c668ba6607f672fe6aea56bb2caab708cf27f90129490b484560a9"} Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.031784 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9" event={"ID":"eeb6c0fe-63cc-404a-a628-885660e52dc9","Type":"ContainerStarted","Data":"16523f52975518510c2e486e4ec8eb3b8282fe9f1a2573e86b0fd7e925f158f9"} Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.032111 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9" Feb 24 
10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.039687 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41","Type":"ContainerStarted","Data":"b640f84e4c03d14b5ab00b92bbace7ae599546229964e6de8fa8e0075249fae1"} Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.045961 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9" Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.048600 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=16.048584349 podStartE2EDuration="16.048584349s" podCreationTimestamp="2026-02-24 10:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:21:03.037945586 +0000 UTC m=+288.151559827" watchObservedRunningTime="2026-02-24 10:21:03.048584349 +0000 UTC m=+288.162198590" Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.112400 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" podStartSLOduration=14.112384898 podStartE2EDuration="14.112384898s" podCreationTimestamp="2026-02-24 10:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:21:03.109953451 +0000 UTC m=+288.223567692" watchObservedRunningTime="2026-02-24 10:21:03.112384898 +0000 UTC m=+288.225999139" Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.128776 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9" podStartSLOduration=14.128762739 podStartE2EDuration="14.128762739s" 
podCreationTimestamp="2026-02-24 10:20:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:21:03.127417007 +0000 UTC m=+288.241031248" watchObservedRunningTime="2026-02-24 10:21:03.128762739 +0000 UTC m=+288.242376980" Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.587040 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.587608 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.591911 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:21:03 crc kubenswrapper[4698]: I0224 10:21:03.591948 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:21:04 crc kubenswrapper[4698]: I0224 10:21:04.050199 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59zd2" 
event={"ID":"d16dadf6-b01e-4bda-b24b-d63801c9bf23","Type":"ContainerStarted","Data":"e57cff86fab457265e85f02e9d87367771d094307c8b484ca39fd9ddf71b6862"} Feb 24 10:21:04 crc kubenswrapper[4698]: I0224 10:21:04.056839 4698 generic.go:334] "Generic (PLEG): container finished" podID="4928b0cb-1f5d-4242-96e1-014e01e5d2d4" containerID="1775780dbc8c034221606971e3745aca410a678dc9ad19d84993d536ee5593fe" exitCode=0 Feb 24 10:21:04 crc kubenswrapper[4698]: I0224 10:21:04.057053 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4928b0cb-1f5d-4242-96e1-014e01e5d2d4","Type":"ContainerDied","Data":"1775780dbc8c034221606971e3745aca410a678dc9ad19d84993d536ee5593fe"} Feb 24 10:21:04 crc kubenswrapper[4698]: I0224 10:21:04.059612 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41","Type":"ContainerStarted","Data":"a1f5c477918c58606fd4bbe2b3fe11db67e3179d6f4fbb24d7058ff653e47859"} Feb 24 10:21:04 crc kubenswrapper[4698]: I0224 10:21:04.062184 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:21:04 crc kubenswrapper[4698]: I0224 10:21:04.062355 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:21:04 crc kubenswrapper[4698]: I0224 10:21:04.064490 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" Feb 24 10:21:04 crc 
kubenswrapper[4698]: I0224 10:21:04.077128 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-59zd2" podStartSLOduration=2.172696805 podStartE2EDuration="53.077100537s" podCreationTimestamp="2026-02-24 10:20:11 +0000 UTC" firstStartedPulling="2026-02-24 10:20:12.675424341 +0000 UTC m=+237.789038582" lastFinishedPulling="2026-02-24 10:21:03.579828073 +0000 UTC m=+288.693442314" observedRunningTime="2026-02-24 10:21:04.072475207 +0000 UTC m=+289.186089468" watchObservedRunningTime="2026-02-24 10:21:04.077100537 +0000 UTC m=+289.190714808" Feb 24 10:21:04 crc kubenswrapper[4698]: I0224 10:21:04.126190 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=12.126167767 podStartE2EDuration="12.126167767s" podCreationTimestamp="2026-02-24 10:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:21:04.118390651 +0000 UTC m=+289.232004912" watchObservedRunningTime="2026-02-24 10:21:04.126167767 +0000 UTC m=+289.239782018" Feb 24 10:21:05 crc kubenswrapper[4698]: I0224 10:21:05.069469 4698 patch_prober.go:28] interesting pod/downloads-7954f5f757-z42jf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Feb 24 10:21:05 crc kubenswrapper[4698]: I0224 10:21:05.070674 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-z42jf" podUID="108d72f5-0dd9-4965-a41f-7403ad8fce04" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Feb 24 10:21:05 crc kubenswrapper[4698]: I0224 10:21:05.460482 4698 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:21:05 crc kubenswrapper[4698]: I0224 10:21:05.490085 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kubelet-dir\") pod \"4928b0cb-1f5d-4242-96e1-014e01e5d2d4\" (UID: \"4928b0cb-1f5d-4242-96e1-014e01e5d2d4\") " Feb 24 10:21:05 crc kubenswrapper[4698]: I0224 10:21:05.490162 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kube-api-access\") pod \"4928b0cb-1f5d-4242-96e1-014e01e5d2d4\" (UID: \"4928b0cb-1f5d-4242-96e1-014e01e5d2d4\") " Feb 24 10:21:05 crc kubenswrapper[4698]: I0224 10:21:05.491455 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4928b0cb-1f5d-4242-96e1-014e01e5d2d4" (UID: "4928b0cb-1f5d-4242-96e1-014e01e5d2d4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:05 crc kubenswrapper[4698]: I0224 10:21:05.509463 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4928b0cb-1f5d-4242-96e1-014e01e5d2d4" (UID: "4928b0cb-1f5d-4242-96e1-014e01e5d2d4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:21:05 crc kubenswrapper[4698]: I0224 10:21:05.591515 4698 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:05 crc kubenswrapper[4698]: I0224 10:21:05.591561 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4928b0cb-1f5d-4242-96e1-014e01e5d2d4-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:06 crc kubenswrapper[4698]: I0224 10:21:06.073461 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 24 10:21:06 crc kubenswrapper[4698]: I0224 10:21:06.073465 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"4928b0cb-1f5d-4242-96e1-014e01e5d2d4","Type":"ContainerDied","Data":"7dae8ebe496c6758cb42e1fe3401a6b44de0a229064af119cd919cbfb2847f95"} Feb 24 10:21:06 crc kubenswrapper[4698]: I0224 10:21:06.073506 4698 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7dae8ebe496c6758cb42e1fe3401a6b44de0a229064af119cd919cbfb2847f95" Feb 24 10:21:07 crc kubenswrapper[4698]: I0224 10:21:07.787039 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36398: no serving certificate available for the kubelet" Feb 24 10:21:11 crc kubenswrapper[4698]: I0224 10:21:11.620891 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:21:11 crc kubenswrapper[4698]: I0224 10:21:11.621369 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:21:12 crc kubenswrapper[4698]: I0224 10:21:12.571711 4698 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:21:12 crc kubenswrapper[4698]: I0224 10:21:12.672913 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:21:13 crc kubenswrapper[4698]: I0224 10:21:13.245383 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59zd2"] Feb 24 10:21:13 crc kubenswrapper[4698]: I0224 10:21:13.593415 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-z42jf" Feb 24 10:21:14 crc kubenswrapper[4698]: I0224 10:21:14.112116 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-59zd2" podUID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" containerName="registry-server" containerID="cri-o://e57cff86fab457265e85f02e9d87367771d094307c8b484ca39fd9ddf71b6862" gracePeriod=2 Feb 24 10:21:16 crc kubenswrapper[4698]: I0224 10:21:16.125321 4698 generic.go:334] "Generic (PLEG): container finished" podID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" containerID="e57cff86fab457265e85f02e9d87367771d094307c8b484ca39fd9ddf71b6862" exitCode=0 Feb 24 10:21:16 crc kubenswrapper[4698]: I0224 10:21:16.125472 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59zd2" event={"ID":"d16dadf6-b01e-4bda-b24b-d63801c9bf23","Type":"ContainerDied","Data":"e57cff86fab457265e85f02e9d87367771d094307c8b484ca39fd9ddf71b6862"} Feb 24 10:21:17 crc kubenswrapper[4698]: I0224 10:21:17.940307 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:21:17 crc kubenswrapper[4698]: I0224 10:21:17.965696 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhsl8\" (UniqueName: \"kubernetes.io/projected/d16dadf6-b01e-4bda-b24b-d63801c9bf23-kube-api-access-hhsl8\") pod \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " Feb 24 10:21:17 crc kubenswrapper[4698]: I0224 10:21:17.966125 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-utilities\") pod \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " Feb 24 10:21:17 crc kubenswrapper[4698]: I0224 10:21:17.966191 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-catalog-content\") pod \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\" (UID: \"d16dadf6-b01e-4bda-b24b-d63801c9bf23\") " Feb 24 10:21:17 crc kubenswrapper[4698]: I0224 10:21:17.966676 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-utilities" (OuterVolumeSpecName: "utilities") pod "d16dadf6-b01e-4bda-b24b-d63801c9bf23" (UID: "d16dadf6-b01e-4bda-b24b-d63801c9bf23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:21:17 crc kubenswrapper[4698]: I0224 10:21:17.970434 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16dadf6-b01e-4bda-b24b-d63801c9bf23-kube-api-access-hhsl8" (OuterVolumeSpecName: "kube-api-access-hhsl8") pod "d16dadf6-b01e-4bda-b24b-d63801c9bf23" (UID: "d16dadf6-b01e-4bda-b24b-d63801c9bf23"). InnerVolumeSpecName "kube-api-access-hhsl8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:21:18 crc kubenswrapper[4698]: I0224 10:21:18.067044 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhsl8\" (UniqueName: \"kubernetes.io/projected/d16dadf6-b01e-4bda-b24b-d63801c9bf23-kube-api-access-hhsl8\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:18 crc kubenswrapper[4698]: I0224 10:21:18.067254 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:18 crc kubenswrapper[4698]: I0224 10:21:18.137008 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59zd2" event={"ID":"d16dadf6-b01e-4bda-b24b-d63801c9bf23","Type":"ContainerDied","Data":"70733949ca29acd9ad2f6c278baae0692c8eea2e136ecf93a964f64c5c7c0edd"} Feb 24 10:21:18 crc kubenswrapper[4698]: I0224 10:21:18.137054 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59zd2" Feb 24 10:21:18 crc kubenswrapper[4698]: I0224 10:21:18.137073 4698 scope.go:117] "RemoveContainer" containerID="e57cff86fab457265e85f02e9d87367771d094307c8b484ca39fd9ddf71b6862" Feb 24 10:21:19 crc kubenswrapper[4698]: I0224 10:21:19.354898 4698 scope.go:117] "RemoveContainer" containerID="579c0dfd2bc9c72c6249d1bf3426c021e097a658cf8c9e3758c64d729c32acfa" Feb 24 10:21:20 crc kubenswrapper[4698]: I0224 10:21:20.776419 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d16dadf6-b01e-4bda-b24b-d63801c9bf23" (UID: "d16dadf6-b01e-4bda-b24b-d63801c9bf23"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:21:20 crc kubenswrapper[4698]: I0224 10:21:20.819420 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d16dadf6-b01e-4bda-b24b-d63801c9bf23-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:20 crc kubenswrapper[4698]: I0224 10:21:20.867115 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59zd2"] Feb 24 10:21:20 crc kubenswrapper[4698]: I0224 10:21:20.870201 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-59zd2"] Feb 24 10:21:21 crc kubenswrapper[4698]: I0224 10:21:21.626160 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" path="/var/lib/kubelet/pods/d16dadf6-b01e-4bda-b24b-d63801c9bf23/volumes" Feb 24 10:21:22 crc kubenswrapper[4698]: I0224 10:21:22.758245 4698 scope.go:117] "RemoveContainer" containerID="f4ea41c11e12dde4995e8ca0a2a489a9cb99bd209c4ec1ebeab649f890485aaf" Feb 24 10:21:24 crc kubenswrapper[4698]: I0224 10:21:24.173981 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hhkf" event={"ID":"2eee2a16-171b-402e-9549-3d14cb56cddc","Type":"ContainerStarted","Data":"6c6f018a2e8183a1e6015cbe5f46a1e17f6ef288d249975edb32c951a1517fd7"} Feb 24 10:21:24 crc kubenswrapper[4698]: I0224 10:21:24.176150 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5dwb" event={"ID":"19022af1-394c-4aab-9eb1-ffb0f566d0ac","Type":"ContainerStarted","Data":"ad84e610cc0720a3fe4e7abe2493501d6e0d3b419be2245cbf762b13501dd835"} Feb 24 10:21:24 crc kubenswrapper[4698]: I0224 10:21:24.177864 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7zvp" 
event={"ID":"291ba94f-a9ac-4d5c-8476-221496078d80","Type":"ContainerStarted","Data":"47883b80e42c225646e36edfeb6751002aed2fe48b26c67e1113ba52f3dc7715"} Feb 24 10:21:24 crc kubenswrapper[4698]: I0224 10:21:24.179422 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppmk4" event={"ID":"55418747-8c79-496a-9b89-68f9eaa3f01a","Type":"ContainerStarted","Data":"a6effa1b70ed0fb18c196662578693c690137e42eb7e7035900913f24f92668d"} Feb 24 10:21:24 crc kubenswrapper[4698]: I0224 10:21:24.181191 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh74l" event={"ID":"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a","Type":"ContainerStarted","Data":"bb77b755be2d1769cc0bb50f56c28187dbf774677a3e89580a42a9bf1f5d8981"} Feb 24 10:21:24 crc kubenswrapper[4698]: I0224 10:21:24.183712 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8bk" event={"ID":"5149fd4f-19d7-4852-b09a-d9909b8231dd","Type":"ContainerStarted","Data":"55ee4801ffb6fe8d2af136c3de246e460ae965ded5eeb7dff258a9366a507f2e"} Feb 24 10:21:24 crc kubenswrapper[4698]: I0224 10:21:24.186172 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9jbm" event={"ID":"25f4eaf1-6171-44dd-b225-be712a45ba1b","Type":"ContainerStarted","Data":"3457dcc61e1e58199b3f7067d5f47baefd4b125660f958180282330d0adc4ee0"} Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.193774 4698 generic.go:334] "Generic (PLEG): container finished" podID="5149fd4f-19d7-4852-b09a-d9909b8231dd" containerID="55ee4801ffb6fe8d2af136c3de246e460ae965ded5eeb7dff258a9366a507f2e" exitCode=0 Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.193907 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8bk" 
event={"ID":"5149fd4f-19d7-4852-b09a-d9909b8231dd","Type":"ContainerDied","Data":"55ee4801ffb6fe8d2af136c3de246e460ae965ded5eeb7dff258a9366a507f2e"} Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.201297 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9jbm" event={"ID":"25f4eaf1-6171-44dd-b225-be712a45ba1b","Type":"ContainerDied","Data":"3457dcc61e1e58199b3f7067d5f47baefd4b125660f958180282330d0adc4ee0"} Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.202346 4698 generic.go:334] "Generic (PLEG): container finished" podID="25f4eaf1-6171-44dd-b225-be712a45ba1b" containerID="3457dcc61e1e58199b3f7067d5f47baefd4b125660f958180282330d0adc4ee0" exitCode=0 Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.206114 4698 generic.go:334] "Generic (PLEG): container finished" podID="2eee2a16-171b-402e-9549-3d14cb56cddc" containerID="6c6f018a2e8183a1e6015cbe5f46a1e17f6ef288d249975edb32c951a1517fd7" exitCode=0 Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.206185 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hhkf" event={"ID":"2eee2a16-171b-402e-9549-3d14cb56cddc","Type":"ContainerDied","Data":"6c6f018a2e8183a1e6015cbe5f46a1e17f6ef288d249975edb32c951a1517fd7"} Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.208621 4698 generic.go:334] "Generic (PLEG): container finished" podID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" containerID="ad84e610cc0720a3fe4e7abe2493501d6e0d3b419be2245cbf762b13501dd835" exitCode=0 Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.209048 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5dwb" event={"ID":"19022af1-394c-4aab-9eb1-ffb0f566d0ac","Type":"ContainerDied","Data":"ad84e610cc0720a3fe4e7abe2493501d6e0d3b419be2245cbf762b13501dd835"} Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.213618 4698 generic.go:334] "Generic (PLEG): container finished" 
podID="291ba94f-a9ac-4d5c-8476-221496078d80" containerID="47883b80e42c225646e36edfeb6751002aed2fe48b26c67e1113ba52f3dc7715" exitCode=0 Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.214158 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7zvp" event={"ID":"291ba94f-a9ac-4d5c-8476-221496078d80","Type":"ContainerDied","Data":"47883b80e42c225646e36edfeb6751002aed2fe48b26c67e1113ba52f3dc7715"} Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.222692 4698 generic.go:334] "Generic (PLEG): container finished" podID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerID="bb77b755be2d1769cc0bb50f56c28187dbf774677a3e89580a42a9bf1f5d8981" exitCode=0 Feb 24 10:21:25 crc kubenswrapper[4698]: I0224 10:21:25.223416 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh74l" event={"ID":"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a","Type":"ContainerDied","Data":"bb77b755be2d1769cc0bb50f56c28187dbf774677a3e89580a42a9bf1f5d8981"} Feb 24 10:21:26 crc kubenswrapper[4698]: I0224 10:21:26.234686 4698 generic.go:334] "Generic (PLEG): container finished" podID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerID="a6effa1b70ed0fb18c196662578693c690137e42eb7e7035900913f24f92668d" exitCode=0 Feb 24 10:21:26 crc kubenswrapper[4698]: I0224 10:21:26.234770 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppmk4" event={"ID":"55418747-8c79-496a-9b89-68f9eaa3f01a","Type":"ContainerDied","Data":"a6effa1b70ed0fb18c196662578693c690137e42eb7e7035900913f24f92668d"} Feb 24 10:21:29 crc kubenswrapper[4698]: I0224 10:21:29.479591 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5795d455f6-j5sh9"] Feb 24 10:21:29 crc kubenswrapper[4698]: I0224 10:21:29.480252 4698 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9" podUID="eeb6c0fe-63cc-404a-a628-885660e52dc9" containerName="controller-manager" containerID="cri-o://4e2bc83f09c668ba6607f672fe6aea56bb2caab708cf27f90129490b484560a9" gracePeriod=30 Feb 24 10:21:29 crc kubenswrapper[4698]: I0224 10:21:29.565851 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"] Feb 24 10:21:29 crc kubenswrapper[4698]: I0224 10:21:29.566081 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" podUID="ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" containerName="route-controller-manager" containerID="cri-o://5858cf29ef2ced6489a5acee1e8fd833c91d868928118e8675dd8157df0455d8" gracePeriod=30 Feb 24 10:21:31 crc kubenswrapper[4698]: I0224 10:21:31.140093 4698 patch_prober.go:28] interesting pod/route-controller-manager-dd4f6d495-p98zt container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Feb 24 10:21:31 crc kubenswrapper[4698]: I0224 10:21:31.140548 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" podUID="ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Feb 24 10:21:31 crc kubenswrapper[4698]: I0224 10:21:31.269685 4698 generic.go:334] "Generic (PLEG): container finished" podID="eeb6c0fe-63cc-404a-a628-885660e52dc9" containerID="4e2bc83f09c668ba6607f672fe6aea56bb2caab708cf27f90129490b484560a9" exitCode=0 Feb 24 10:21:31 crc kubenswrapper[4698]: I0224 10:21:31.269768 4698 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9" event={"ID":"eeb6c0fe-63cc-404a-a628-885660e52dc9","Type":"ContainerDied","Data":"4e2bc83f09c668ba6607f672fe6aea56bb2caab708cf27f90129490b484560a9"} Feb 24 10:21:31 crc kubenswrapper[4698]: I0224 10:21:31.271891 4698 generic.go:334] "Generic (PLEG): container finished" podID="ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" containerID="5858cf29ef2ced6489a5acee1e8fd833c91d868928118e8675dd8157df0455d8" exitCode=0 Feb 24 10:21:31 crc kubenswrapper[4698]: I0224 10:21:31.271917 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" event={"ID":"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c","Type":"ContainerDied","Data":"5858cf29ef2ced6489a5acee1e8fd833c91d868928118e8675dd8157df0455d8"} Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.044726 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.104521 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-85cb469ccf-nndrr"] Feb 24 10:21:32 crc kubenswrapper[4698]: E0224 10:21:32.105102 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb6c0fe-63cc-404a-a628-885660e52dc9" containerName="controller-manager" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.105125 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb6c0fe-63cc-404a-a628-885660e52dc9" containerName="controller-manager" Feb 24 10:21:32 crc kubenswrapper[4698]: E0224 10:21:32.105139 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" containerName="extract-content" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.105149 4698 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" containerName="extract-content" Feb 24 10:21:32 crc kubenswrapper[4698]: E0224 10:21:32.105159 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" containerName="registry-server" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.105167 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" containerName="registry-server" Feb 24 10:21:32 crc kubenswrapper[4698]: E0224 10:21:32.105184 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4928b0cb-1f5d-4242-96e1-014e01e5d2d4" containerName="pruner" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.105192 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="4928b0cb-1f5d-4242-96e1-014e01e5d2d4" containerName="pruner" Feb 24 10:21:32 crc kubenswrapper[4698]: E0224 10:21:32.105202 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" containerName="extract-utilities" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.105210 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" containerName="extract-utilities" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.105347 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16dadf6-b01e-4bda-b24b-d63801c9bf23" containerName="registry-server" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.105365 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="4928b0cb-1f5d-4242-96e1-014e01e5d2d4" containerName="pruner" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.105380 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb6c0fe-63cc-404a-a628-885660e52dc9" containerName="controller-manager" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.105806 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.117592 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85cb469ccf-nndrr"] Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.184398 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeb6c0fe-63cc-404a-a628-885660e52dc9-serving-cert\") pod \"eeb6c0fe-63cc-404a-a628-885660e52dc9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.184490 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-config\") pod \"eeb6c0fe-63cc-404a-a628-885660e52dc9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.184556 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-client-ca\") pod \"eeb6c0fe-63cc-404a-a628-885660e52dc9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.184592 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-proxy-ca-bundles\") pod \"eeb6c0fe-63cc-404a-a628-885660e52dc9\" (UID: \"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.184636 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7jd2\" (UniqueName: \"kubernetes.io/projected/eeb6c0fe-63cc-404a-a628-885660e52dc9-kube-api-access-w7jd2\") pod \"eeb6c0fe-63cc-404a-a628-885660e52dc9\" (UID: 
\"eeb6c0fe-63cc-404a-a628-885660e52dc9\") " Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.184797 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e266bb2f-40eb-4da2-9767-0a300c8dc27b-serving-cert\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.184840 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-config\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.184864 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfrc4\" (UniqueName: \"kubernetes.io/projected/e266bb2f-40eb-4da2-9767-0a300c8dc27b-kube-api-access-sfrc4\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.184921 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-client-ca\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.185033 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-proxy-ca-bundles\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.185346 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "eeb6c0fe-63cc-404a-a628-885660e52dc9" (UID: "eeb6c0fe-63cc-404a-a628-885660e52dc9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.185620 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-config" (OuterVolumeSpecName: "config") pod "eeb6c0fe-63cc-404a-a628-885660e52dc9" (UID: "eeb6c0fe-63cc-404a-a628-885660e52dc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.186074 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "eeb6c0fe-63cc-404a-a628-885660e52dc9" (UID: "eeb6c0fe-63cc-404a-a628-885660e52dc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.195465 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eeb6c0fe-63cc-404a-a628-885660e52dc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eeb6c0fe-63cc-404a-a628-885660e52dc9" (UID: "eeb6c0fe-63cc-404a-a628-885660e52dc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.195513 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb6c0fe-63cc-404a-a628-885660e52dc9-kube-api-access-w7jd2" (OuterVolumeSpecName: "kube-api-access-w7jd2") pod "eeb6c0fe-63cc-404a-a628-885660e52dc9" (UID: "eeb6c0fe-63cc-404a-a628-885660e52dc9"). InnerVolumeSpecName "kube-api-access-w7jd2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.281573 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9" event={"ID":"eeb6c0fe-63cc-404a-a628-885660e52dc9","Type":"ContainerDied","Data":"16523f52975518510c2e486e4ec8eb3b8282fe9f1a2573e86b0fd7e925f158f9"} Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.281650 4698 scope.go:117] "RemoveContainer" containerID="4e2bc83f09c668ba6607f672fe6aea56bb2caab708cf27f90129490b484560a9" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.281660 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5795d455f6-j5sh9" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.285882 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-config\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.285926 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfrc4\" (UniqueName: \"kubernetes.io/projected/e266bb2f-40eb-4da2-9767-0a300c8dc27b-kube-api-access-sfrc4\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.285980 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-client-ca\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.286026 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-proxy-ca-bundles\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.286084 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e266bb2f-40eb-4da2-9767-0a300c8dc27b-serving-cert\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.286143 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7jd2\" (UniqueName: \"kubernetes.io/projected/eeb6c0fe-63cc-404a-a628-885660e52dc9-kube-api-access-w7jd2\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.286164 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeb6c0fe-63cc-404a-a628-885660e52dc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.286181 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.286200 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.286215 4698 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eeb6c0fe-63cc-404a-a628-885660e52dc9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.287450 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-proxy-ca-bundles\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 
crc kubenswrapper[4698]: I0224 10:21:32.287794 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-client-ca\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.287897 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-config\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.292350 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e266bb2f-40eb-4da2-9767-0a300c8dc27b-serving-cert\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.320033 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfrc4\" (UniqueName: \"kubernetes.io/projected/e266bb2f-40eb-4da2-9767-0a300c8dc27b-kube-api-access-sfrc4\") pod \"controller-manager-85cb469ccf-nndrr\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.342286 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5795d455f6-j5sh9"] Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.347789 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-5795d455f6-j5sh9"] Feb 24 10:21:32 crc kubenswrapper[4698]: I0224 10:21:32.430873 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.625116 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.628202 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeb6c0fe-63cc-404a-a628-885660e52dc9" path="/var/lib/kubelet/pods/eeb6c0fe-63cc-404a-a628-885660e52dc9/volumes" Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.709751 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps8rj\" (UniqueName: \"kubernetes.io/projected/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-kube-api-access-ps8rj\") pod \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.709898 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-config\") pod \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.710031 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-client-ca\") pod \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.710094 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-serving-cert\") pod \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\" (UID: \"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c\") " Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.711244 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-config" (OuterVolumeSpecName: "config") pod "ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" (UID: "ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.711895 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-client-ca" (OuterVolumeSpecName: "client-ca") pod "ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" (UID: "ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.714031 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-kube-api-access-ps8rj" (OuterVolumeSpecName: "kube-api-access-ps8rj") pod "ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" (UID: "ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c"). InnerVolumeSpecName "kube-api-access-ps8rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.715151 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" (UID: "ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.811874 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.811927 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.811953 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:33 crc kubenswrapper[4698]: I0224 10:21:33.811973 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps8rj\" (UniqueName: \"kubernetes.io/projected/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c-kube-api-access-ps8rj\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.297961 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.297994 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt" event={"ID":"ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c","Type":"ContainerDied","Data":"3326bb2cc9a71a1dfbb7030154ed607cb3e316c8fd2f35577f999b8f5c67505a"} Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.345989 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"] Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.351600 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dd4f6d495-p98zt"] Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.842333 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6"] Feb 24 10:21:34 crc kubenswrapper[4698]: E0224 10:21:34.842637 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" containerName="route-controller-manager" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.843103 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" containerName="route-controller-manager" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.843419 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" containerName="route-controller-manager" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.844130 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.852586 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.853505 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.853551 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.854389 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.854705 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.855332 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.892003 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6"] Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.928392 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-config\") pod \"route-controller-manager-695f7b9b5-pqzw6\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.928601 4698 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/101b40a6-d373-47b1-83f5-b5bf8bd579c8-serving-cert\") pod \"route-controller-manager-695f7b9b5-pqzw6\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.928754 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsvgh\" (UniqueName: \"kubernetes.io/projected/101b40a6-d373-47b1-83f5-b5bf8bd579c8-kube-api-access-xsvgh\") pod \"route-controller-manager-695f7b9b5-pqzw6\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:34 crc kubenswrapper[4698]: I0224 10:21:34.928809 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-client-ca\") pod \"route-controller-manager-695f7b9b5-pqzw6\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:35 crc kubenswrapper[4698]: I0224 10:21:35.029943 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-config\") pod \"route-controller-manager-695f7b9b5-pqzw6\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:35 crc kubenswrapper[4698]: I0224 10:21:35.030004 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/101b40a6-d373-47b1-83f5-b5bf8bd579c8-serving-cert\") pod \"route-controller-manager-695f7b9b5-pqzw6\" 
(UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:35 crc kubenswrapper[4698]: I0224 10:21:35.030039 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsvgh\" (UniqueName: \"kubernetes.io/projected/101b40a6-d373-47b1-83f5-b5bf8bd579c8-kube-api-access-xsvgh\") pod \"route-controller-manager-695f7b9b5-pqzw6\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:35 crc kubenswrapper[4698]: I0224 10:21:35.030064 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-client-ca\") pod \"route-controller-manager-695f7b9b5-pqzw6\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:35 crc kubenswrapper[4698]: I0224 10:21:35.032199 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-client-ca\") pod \"route-controller-manager-695f7b9b5-pqzw6\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:35 crc kubenswrapper[4698]: I0224 10:21:35.032469 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-config\") pod \"route-controller-manager-695f7b9b5-pqzw6\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:35 crc kubenswrapper[4698]: I0224 10:21:35.044031 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/101b40a6-d373-47b1-83f5-b5bf8bd579c8-serving-cert\") pod \"route-controller-manager-695f7b9b5-pqzw6\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:35 crc kubenswrapper[4698]: I0224 10:21:35.062431 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsvgh\" (UniqueName: \"kubernetes.io/projected/101b40a6-d373-47b1-83f5-b5bf8bd579c8-kube-api-access-xsvgh\") pod \"route-controller-manager-695f7b9b5-pqzw6\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:35 crc kubenswrapper[4698]: I0224 10:21:35.186793 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:35 crc kubenswrapper[4698]: I0224 10:21:35.629322 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c" path="/var/lib/kubelet/pods/ffa701b5-b0fd-46e8-b2dc-50daf6d2c30c/volumes" Feb 24 10:21:39 crc kubenswrapper[4698]: I0224 10:21:39.509609 4698 scope.go:117] "RemoveContainer" containerID="5858cf29ef2ced6489a5acee1e8fd833c91d868928118e8675dd8157df0455d8" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.765832 4698 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.766951 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35" gracePeriod=15 Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.767203 4698 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc" gracePeriod=15 Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.767325 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61" gracePeriod=15 Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.767394 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1" gracePeriod=15 Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.767449 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c" gracePeriod=15 Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.768377 4698 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 10:21:40 crc kubenswrapper[4698]: E0224 10:21:40.768697 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.768717 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: E0224 10:21:40.768730 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.768744 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: E0224 10:21:40.768761 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.768773 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 10:21:40 crc kubenswrapper[4698]: E0224 10:21:40.768803 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.768816 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 10:21:40 crc kubenswrapper[4698]: E0224 10:21:40.768830 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.768842 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 10:21:40 crc kubenswrapper[4698]: E0224 10:21:40.768856 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: 
I0224 10:21:40.768867 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: E0224 10:21:40.768890 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.768902 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 24 10:21:40 crc kubenswrapper[4698]: E0224 10:21:40.768915 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.768926 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.769091 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.769112 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.769132 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.769145 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.769163 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.769177 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.769195 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: E0224 10:21:40.769418 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.769434 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: E0224 10:21:40.769451 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.769463 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.769645 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.769666 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.771779 4698 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.772619 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.776583 4698 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.816894 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.911449 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.911525 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.911599 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.911627 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.911789 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.911833 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.911902 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:40 crc kubenswrapper[4698]: I0224 10:21:40.911936 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.012854 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.012919 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.012980 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013004 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013037 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013057 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013105 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013124 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013127 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013195 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013239 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013291 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013292 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013320 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013347 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.013367 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.113328 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.352165 4698 generic.go:334] "Generic (PLEG): container finished" podID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" containerID="a1f5c477918c58606fd4bbe2b3fe11db67e3179d6f4fbb24d7058ff653e47859" exitCode=0 Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.352300 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41","Type":"ContainerDied","Data":"a1f5c477918c58606fd4bbe2b3fe11db67e3179d6f4fbb24d7058ff653e47859"} Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.353436 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.354505 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.355286 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.357506 4698 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.358640 4698 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc" exitCode=0 Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.358673 4698 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61" exitCode=0 Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.358692 4698 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1" exitCode=0 Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.358708 4698 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c" exitCode=2 Feb 24 10:21:41 crc kubenswrapper[4698]: E0224 10:21:41.607939 4698 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:41 crc kubenswrapper[4698]: E0224 10:21:41.608642 4698 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:41 crc kubenswrapper[4698]: E0224 10:21:41.609414 4698 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: 
connect: connection refused" Feb 24 10:21:41 crc kubenswrapper[4698]: E0224 10:21:41.610097 4698 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:41 crc kubenswrapper[4698]: E0224 10:21:41.610621 4698 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:41 crc kubenswrapper[4698]: I0224 10:21:41.610667 4698 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 24 10:21:41 crc kubenswrapper[4698]: E0224 10:21:41.611081 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="200ms" Feb 24 10:21:41 crc kubenswrapper[4698]: E0224 10:21:41.811884 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="400ms" Feb 24 10:21:42 crc kubenswrapper[4698]: E0224 10:21:42.213536 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="800ms" Feb 24 10:21:43 crc kubenswrapper[4698]: E0224 10:21:43.015377 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="1.6s" Feb 24 10:21:43 crc kubenswrapper[4698]: E0224 10:21:43.199025 4698 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-ppmk4.18972795e82dae79 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-ppmk4,UID:55418747-8c79-496a-9b89-68f9eaa3f01a,APIVersion:v1,ResourceVersion:28605,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 16.955s (16.955s including waiting). 
Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:21:43.192497785 +0000 UTC m=+328.306112026,LastTimestamp:2026-02-24 10:21:43.192497785 +0000 UTC m=+328.306112026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:21:44 crc kubenswrapper[4698]: I0224 10:21:44.385744 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 24 10:21:44 crc kubenswrapper[4698]: I0224 10:21:44.387813 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 10:21:44 crc kubenswrapper[4698]: I0224 10:21:44.388690 4698 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35" exitCode=0 Feb 24 10:21:44 crc kubenswrapper[4698]: E0224 10:21:44.616218 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="3.2s" Feb 24 10:21:45 crc kubenswrapper[4698]: I0224 10:21:45.620036 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:45 crc kubenswrapper[4698]: I0224 10:21:45.620459 4698 status_manager.go:851] "Failed to get status for pod" 
podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:45 crc kubenswrapper[4698]: I0224 10:21:45.881217 4698 scope.go:117] "RemoveContainer" containerID="64b39341e105fbe8aa9dc4c108f6ee8a2bff33568a205e32e639b8382ab2ccb2" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.013576 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.014088 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.014247 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.205737 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kubelet-dir\") pod \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.205827 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-var-lock\") pod \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.205937 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kube-api-access\") pod \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\" (UID: \"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41\") " Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.209975 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" (UID: "d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.210237 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-var-lock" (OuterVolumeSpecName: "var-lock") pod "d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" (UID: "d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.214601 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" (UID: "d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.308349 4698 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.308394 4698 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-var-lock\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.308409 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.410672 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41","Type":"ContainerDied","Data":"b640f84e4c03d14b5ab00b92bbace7ae599546229964e6de8fa8e0075249fae1"} Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.410722 4698 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b640f84e4c03d14b5ab00b92bbace7ae599546229964e6de8fa8e0075249fae1" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.410785 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.435887 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:46 crc kubenswrapper[4698]: I0224 10:21:46.436428 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.420548 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hhkf" event={"ID":"2eee2a16-171b-402e-9549-3d14cb56cddc","Type":"ContainerStarted","Data":"bfb3f47307f92e49a3096d724083678c00dbda225ab4067de2614a5c8f4a0a9f"} Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.424309 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.426160 4698 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09e0397ce0a499dbaad45b5acfc9f732f4ef4de58c4bf2507c8cc09487292bc5" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.429193 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7zvp" event={"ID":"291ba94f-a9ac-4d5c-8476-221496078d80","Type":"ContainerStarted","Data":"65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56"} Feb 24 10:21:47 crc 
kubenswrapper[4698]: I0224 10:21:47.429546 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.430394 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.431061 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.431537 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.431999 4698 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.547170 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.547301 4698 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.547337 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.547392 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.547445 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.547566 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.547802 4698 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.547818 4698 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.547830 4698 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:21:47 crc kubenswrapper[4698]: I0224 10:21:47.626614 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 24 10:21:47 crc kubenswrapper[4698]: E0224 10:21:47.817833 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="6.4s" Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.050934 4698 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-85cb469ccf-nndrr_openshift-controller-manager_e266bb2f-40eb-4da2-9767-0a300c8dc27b_0(1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360): error adding pod openshift-controller-manager_controller-manager-85cb469ccf-nndrr to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd 
(shim): CNI request failed with status 400: 'ContainerID:"1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360" Netns:"/var/run/netns/246bdd8f-e2fe-45eb-b47d-a76e0d85a751" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-85cb469ccf-nndrr;K8S_POD_INFRA_CONTAINER_ID=1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360;K8S_POD_UID=e266bb2f-40eb-4da2-9767-0a300c8dc27b" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-85cb469ccf-nndrr] networking: Multus: [openshift-controller-manager/controller-manager-85cb469ccf-nndrr/e266bb2f-40eb-4da2-9767-0a300c8dc27b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-85cb469ccf-nndrr?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc kubenswrapper[4698]: > Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.051456 4698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_controller-manager-85cb469ccf-nndrr_openshift-controller-manager_e266bb2f-40eb-4da2-9767-0a300c8dc27b_0(1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360): error adding pod openshift-controller-manager_controller-manager-85cb469ccf-nndrr to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360" Netns:"/var/run/netns/246bdd8f-e2fe-45eb-b47d-a76e0d85a751" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-85cb469ccf-nndrr;K8S_POD_INFRA_CONTAINER_ID=1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360;K8S_POD_UID=e266bb2f-40eb-4da2-9767-0a300c8dc27b" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-85cb469ccf-nndrr] networking: Multus: [openshift-controller-manager/controller-manager-85cb469ccf-nndrr/e266bb2f-40eb-4da2-9767-0a300c8dc27b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-85cb469ccf-nndrr?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc 
kubenswrapper[4698]: > pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.051528 4698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-85cb469ccf-nndrr_openshift-controller-manager_e266bb2f-40eb-4da2-9767-0a300c8dc27b_0(1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360): error adding pod openshift-controller-manager_controller-manager-85cb469ccf-nndrr to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360" Netns:"/var/run/netns/246bdd8f-e2fe-45eb-b47d-a76e0d85a751" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-85cb469ccf-nndrr;K8S_POD_INFRA_CONTAINER_ID=1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360;K8S_POD_UID=e266bb2f-40eb-4da2-9767-0a300c8dc27b" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-85cb469ccf-nndrr] networking: Multus: [openshift-controller-manager/controller-manager-85cb469ccf-nndrr/e266bb2f-40eb-4da2-9767-0a300c8dc27b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-85cb469ccf-nndrr?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc kubenswrapper[4698]: > pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.051629 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-85cb469ccf-nndrr_openshift-controller-manager(e266bb2f-40eb-4da2-9767-0a300c8dc27b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-85cb469ccf-nndrr_openshift-controller-manager(e266bb2f-40eb-4da2-9767-0a300c8dc27b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-85cb469ccf-nndrr_openshift-controller-manager_e266bb2f-40eb-4da2-9767-0a300c8dc27b_0(1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360): error adding pod openshift-controller-manager_controller-manager-85cb469ccf-nndrr to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360\\\" Netns:\\\"/var/run/netns/246bdd8f-e2fe-45eb-b47d-a76e0d85a751\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-85cb469ccf-nndrr;K8S_POD_INFRA_CONTAINER_ID=1d4ea375c3e513642b49dec706e0847d23792f0ba60b02c043331a61b3d21360;K8S_POD_UID=e266bb2f-40eb-4da2-9767-0a300c8dc27b\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-85cb469ccf-nndrr] networking: Multus: 
[openshift-controller-manager/controller-manager-85cb469ccf-nndrr/e266bb2f-40eb-4da2-9767-0a300c8dc27b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-85cb469ccf-nndrr?timeout=1m0s\\\": dial tcp 38.102.83.65:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" podUID="e266bb2f-40eb-4da2-9767-0a300c8dc27b" Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.336530 4698 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager_101b40a6-d373-47b1-83f5-b5bf8bd579c8_0(4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754): error adding pod openshift-route-controller-manager_route-controller-manager-695f7b9b5-pqzw6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754" 
Netns:"/var/run/netns/0a21f96d-552d-4054-bdc5-84f18d937dc4" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-695f7b9b5-pqzw6;K8S_POD_INFRA_CONTAINER_ID=4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754;K8S_POD_UID=101b40a6-d373-47b1-83f5-b5bf8bd579c8" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6] networking: Multus: [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6/101b40a6-d373-47b1-83f5-b5bf8bd579c8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-695f7b9b5-pqzw6?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc kubenswrapper[4698]: > Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.336594 4698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager_101b40a6-d373-47b1-83f5-b5bf8bd579c8_0(4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754): 
error adding pod openshift-route-controller-manager_route-controller-manager-695f7b9b5-pqzw6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754" Netns:"/var/run/netns/0a21f96d-552d-4054-bdc5-84f18d937dc4" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-695f7b9b5-pqzw6;K8S_POD_INFRA_CONTAINER_ID=4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754;K8S_POD_UID=101b40a6-d373-47b1-83f5-b5bf8bd579c8" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6] networking: Multus: [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6/101b40a6-d373-47b1-83f5-b5bf8bd579c8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-695f7b9b5-pqzw6?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc kubenswrapper[4698]: > pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:48 crc 
kubenswrapper[4698]: E0224 10:21:48.336618 4698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager_101b40a6-d373-47b1-83f5-b5bf8bd579c8_0(4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754): error adding pod openshift-route-controller-manager_route-controller-manager-695f7b9b5-pqzw6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754" Netns:"/var/run/netns/0a21f96d-552d-4054-bdc5-84f18d937dc4" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-695f7b9b5-pqzw6;K8S_POD_INFRA_CONTAINER_ID=4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754;K8S_POD_UID=101b40a6-d373-47b1-83f5-b5bf8bd579c8" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6] networking: Multus: [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6/101b40a6-d373-47b1-83f5-b5bf8bd579c8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-695f7b9b5-pqzw6?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc kubenswrapper[4698]: > pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.336683 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager(101b40a6-d373-47b1-83f5-b5bf8bd579c8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager(101b40a6-d373-47b1-83f5-b5bf8bd579c8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager_101b40a6-d373-47b1-83f5-b5bf8bd579c8_0(4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754): error adding pod openshift-route-controller-manager_route-controller-manager-695f7b9b5-pqzw6 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754\\\" Netns:\\\"/var/run/netns/0a21f96d-552d-4054-bdc5-84f18d937dc4\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-695f7b9b5-pqzw6;K8S_POD_INFRA_CONTAINER_ID=4efe29d389f90495a8a1683dc233d01ea789b47c5fc54b1dbbb7b8e01a262754;K8S_POD_UID=101b40a6-d373-47b1-83f5-b5bf8bd579c8\\\" Path:\\\"\\\" ERRORED: error configuring pod 
[openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6] networking: Multus: [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6/101b40a6-d373-47b1-83f5-b5bf8bd579c8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-695f7b9b5-pqzw6?timeout=1m0s\\\": dial tcp 38.102.83.65:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.435547 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"41a379785f748901cdbcf51f7f11554530408f6c20551dd9a4b041c452c3d51b"} Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.439054 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.440489 4698 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.440556 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.440571 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.440913 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.440961 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.441160 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.441413 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.441830 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.442036 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.442185 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.442360 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.442507 4698 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.442654 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.447350 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.447825 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.448180 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.448494 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: I0224 10:21:48.448768 4698 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.867157 4698 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager_101b40a6-d373-47b1-83f5-b5bf8bd579c8_0(159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5): error adding pod openshift-route-controller-manager_route-controller-manager-695f7b9b5-pqzw6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5" Netns:"/var/run/netns/9a62c205-2a19-45ad-bc54-a2fd4a234f37" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-695f7b9b5-pqzw6;K8S_POD_INFRA_CONTAINER_ID=159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5;K8S_POD_UID=101b40a6-d373-47b1-83f5-b5bf8bd579c8" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6] networking: Multus: [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6/101b40a6-d373-47b1-83f5-b5bf8bd579c8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-695f7b9b5-pqzw6?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc kubenswrapper[4698]: > Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.867211 4698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager_101b40a6-d373-47b1-83f5-b5bf8bd579c8_0(159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5): error adding pod 
openshift-route-controller-manager_route-controller-manager-695f7b9b5-pqzw6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5" Netns:"/var/run/netns/9a62c205-2a19-45ad-bc54-a2fd4a234f37" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-695f7b9b5-pqzw6;K8S_POD_INFRA_CONTAINER_ID=159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5;K8S_POD_UID=101b40a6-d373-47b1-83f5-b5bf8bd579c8" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6] networking: Multus: [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6/101b40a6-d373-47b1-83f5-b5bf8bd579c8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-695f7b9b5-pqzw6?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc kubenswrapper[4698]: > pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:48 crc kubenswrapper[4698]: 
E0224 10:21:48.867232 4698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager_101b40a6-d373-47b1-83f5-b5bf8bd579c8_0(159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5): error adding pod openshift-route-controller-manager_route-controller-manager-695f7b9b5-pqzw6 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5" Netns:"/var/run/netns/9a62c205-2a19-45ad-bc54-a2fd4a234f37" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-695f7b9b5-pqzw6;K8S_POD_INFRA_CONTAINER_ID=159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5;K8S_POD_UID=101b40a6-d373-47b1-83f5-b5bf8bd579c8" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6] networking: Multus: [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6/101b40a6-d373-47b1-83f5-b5bf8bd579c8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-695f7b9b5-pqzw6?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc kubenswrapper[4698]: > pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.867331 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager(101b40a6-d373-47b1-83f5-b5bf8bd579c8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager(101b40a6-d373-47b1-83f5-b5bf8bd579c8)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-695f7b9b5-pqzw6_openshift-route-controller-manager_101b40a6-d373-47b1-83f5-b5bf8bd579c8_0(159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5): error adding pod openshift-route-controller-manager_route-controller-manager-695f7b9b5-pqzw6 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5\\\" Netns:\\\"/var/run/netns/9a62c205-2a19-45ad-bc54-a2fd4a234f37\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-695f7b9b5-pqzw6;K8S_POD_INFRA_CONTAINER_ID=159c8ada8c44bd83850f8885d89b99dfca302628cb597cddbfc6fd9bfd7e57c5;K8S_POD_UID=101b40a6-d373-47b1-83f5-b5bf8bd579c8\\\" Path:\\\"\\\" ERRORED: error configuring pod 
[openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6] networking: Multus: [openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6/101b40a6-d373-47b1-83f5-b5bf8bd579c8]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-695f7b9b5-pqzw6 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-695f7b9b5-pqzw6?timeout=1m0s\\\": dial tcp 38.102.83.65:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.935764 4698 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-85cb469ccf-nndrr_openshift-controller-manager_e266bb2f-40eb-4da2-9767-0a300c8dc27b_0(d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5): error adding pod openshift-controller-manager_controller-manager-85cb469ccf-nndrr to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 
'ContainerID:"d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5" Netns:"/var/run/netns/e0087c64-c1b2-4ccc-9e94-ee6f5797c0d9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-85cb469ccf-nndrr;K8S_POD_INFRA_CONTAINER_ID=d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5;K8S_POD_UID=e266bb2f-40eb-4da2-9767-0a300c8dc27b" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-85cb469ccf-nndrr] networking: Multus: [openshift-controller-manager/controller-manager-85cb469ccf-nndrr/e266bb2f-40eb-4da2-9767-0a300c8dc27b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-85cb469ccf-nndrr?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc kubenswrapper[4698]: > Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.935856 4698 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_controller-manager-85cb469ccf-nndrr_openshift-controller-manager_e266bb2f-40eb-4da2-9767-0a300c8dc27b_0(d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5): error adding pod openshift-controller-manager_controller-manager-85cb469ccf-nndrr to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5" Netns:"/var/run/netns/e0087c64-c1b2-4ccc-9e94-ee6f5797c0d9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-85cb469ccf-nndrr;K8S_POD_INFRA_CONTAINER_ID=d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5;K8S_POD_UID=e266bb2f-40eb-4da2-9767-0a300c8dc27b" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-85cb469ccf-nndrr] networking: Multus: [openshift-controller-manager/controller-manager-85cb469ccf-nndrr/e266bb2f-40eb-4da2-9767-0a300c8dc27b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-85cb469ccf-nndrr?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc 
kubenswrapper[4698]: > pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.935887 4698 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 24 10:21:48 crc kubenswrapper[4698]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-85cb469ccf-nndrr_openshift-controller-manager_e266bb2f-40eb-4da2-9767-0a300c8dc27b_0(d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5): error adding pod openshift-controller-manager_controller-manager-85cb469ccf-nndrr to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5" Netns:"/var/run/netns/e0087c64-c1b2-4ccc-9e94-ee6f5797c0d9" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-85cb469ccf-nndrr;K8S_POD_INFRA_CONTAINER_ID=d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5;K8S_POD_UID=e266bb2f-40eb-4da2-9767-0a300c8dc27b" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-85cb469ccf-nndrr] networking: Multus: [openshift-controller-manager/controller-manager-85cb469ccf-nndrr/e266bb2f-40eb-4da2-9767-0a300c8dc27b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-85cb469ccf-nndrr?timeout=1m0s": dial tcp 38.102.83.65:6443: connect: connection refused Feb 24 10:21:48 crc kubenswrapper[4698]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 24 10:21:48 crc kubenswrapper[4698]: > pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:21:48 crc kubenswrapper[4698]: E0224 10:21:48.936016 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-85cb469ccf-nndrr_openshift-controller-manager(e266bb2f-40eb-4da2-9767-0a300c8dc27b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-85cb469ccf-nndrr_openshift-controller-manager(e266bb2f-40eb-4da2-9767-0a300c8dc27b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-85cb469ccf-nndrr_openshift-controller-manager_e266bb2f-40eb-4da2-9767-0a300c8dc27b_0(d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5): error adding pod openshift-controller-manager_controller-manager-85cb469ccf-nndrr to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5\\\" Netns:\\\"/var/run/netns/e0087c64-c1b2-4ccc-9e94-ee6f5797c0d9\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-85cb469ccf-nndrr;K8S_POD_INFRA_CONTAINER_ID=d9f96db76ab58ed368ce271d3de1b22b3e4dc39b35e62331df581a7a2049e0e5;K8S_POD_UID=e266bb2f-40eb-4da2-9767-0a300c8dc27b\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-85cb469ccf-nndrr] networking: Multus: 
[openshift-controller-manager/controller-manager-85cb469ccf-nndrr/e266bb2f-40eb-4da2-9767-0a300c8dc27b]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-85cb469ccf-nndrr in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-85cb469ccf-nndrr?timeout=1m0s\\\": dial tcp 38.102.83.65:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" podUID="e266bb2f-40eb-4da2-9767-0a300c8dc27b" Feb 24 10:21:49 crc kubenswrapper[4698]: I0224 10:21:49.450590 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3"} Feb 24 10:21:49 crc kubenswrapper[4698]: I0224 10:21:49.453789 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5dwb" event={"ID":"19022af1-394c-4aab-9eb1-ffb0f566d0ac","Type":"ContainerStarted","Data":"7f21bc1060193fd3dc0a8ba603e960a96ad825da9a8bc7332c1b998c3f99b59b"} Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.460004 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-ppmk4" event={"ID":"55418747-8c79-496a-9b89-68f9eaa3f01a","Type":"ContainerStarted","Data":"3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19"} Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.461392 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.461617 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.461926 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.462361 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.462653 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" 
pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.463235 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh74l" event={"ID":"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a","Type":"ContainerStarted","Data":"25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c"} Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.464721 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8bk" event={"ID":"5149fd4f-19d7-4852-b09a-d9909b8231dd","Type":"ContainerStarted","Data":"12504475d7d48a69d790441d24d5db1b06a5fc56a42debad4e922a6ebf5b8f0c"} Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.467307 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9jbm" event={"ID":"25f4eaf1-6171-44dd-b225-be712a45ba1b","Type":"ContainerStarted","Data":"d27747ceecec130db4ae589810d43e4170d1292ad502837658213b005fb212a6"} Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.467863 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.468095 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection 
refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.468428 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.470527 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.470795 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.471096 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.471369 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 
10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.471619 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.471856 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.472095 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.475113 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.475363 4698 status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: 
E0224 10:21:50.607360 4698 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.65:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-operators-ppmk4.18972795e82dae79 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-operators-ppmk4,UID:55418747-8c79-496a-9b89-68f9eaa3f01a,APIVersion:v1,ResourceVersion:28605,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 16.955s (16.955s including waiting). Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-24 10:21:43.192497785 +0000 UTC m=+328.306112026,LastTimestamp:2026-02-24 10:21:43.192497785 +0000 UTC m=+328.306112026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.940070 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.940167 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.991327 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.991799 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.992276 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.992701 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.993018 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.993358 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.993634 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:50 crc kubenswrapper[4698]: I0224 10:21:50.993910 4698 status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.150247 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.150313 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.187828 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-p5dwb" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.188361 4698 status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.188987 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.189488 4698 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.189760 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.190110 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.190641 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.191024 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.471908 4698 
status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.472344 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.472758 4698 status_manager.go:851] "Failed to get status for pod" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" pod="openshift-marketplace/redhat-marketplace-dh74l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dh74l\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.473054 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.473295 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.473576 4698 
status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.473866 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.474237 4698 status_manager.go:851] "Failed to get status for pod" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" pod="openshift-marketplace/community-operators-2z8bk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2z8bk\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.474504 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.474898 4698 status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.475226 4698 status_manager.go:851] "Failed to 
get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.475554 4698 status_manager.go:851] "Failed to get status for pod" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" pod="openshift-marketplace/redhat-marketplace-dh74l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dh74l\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.475827 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.476116 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.476426 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.476761 4698 status_manager.go:851] "Failed to get status for 
pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.477018 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:51 crc kubenswrapper[4698]: I0224 10:21:51.477328 4698 status_manager.go:851] "Failed to get status for pod" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" pod="openshift-marketplace/community-operators-2z8bk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2z8bk\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: I0224 10:21:52.542758 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8hhkf" Feb 24 10:21:52 crc kubenswrapper[4698]: I0224 10:21:52.543327 4698 status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: I0224 10:21:52.544008 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 
38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: I0224 10:21:52.544486 4698 status_manager.go:851] "Failed to get status for pod" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" pod="openshift-marketplace/redhat-marketplace-dh74l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dh74l\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: I0224 10:21:52.544881 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: I0224 10:21:52.545302 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: I0224 10:21:52.545566 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: I0224 10:21:52.545901 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 
38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: I0224 10:21:52.546292 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: I0224 10:21:52.546613 4698 status_manager.go:851] "Failed to get status for pod" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" pod="openshift-marketplace/community-operators-2z8bk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2z8bk\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: E0224 10:21:52.651144 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:21:52Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:21:52Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:21:52Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-24T10:21:52Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:08cff7c9164822cf90c1ddc99284f5fd3c4efbfdf7ff5d2da94ff20f03d57215\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8665346de3cec5b1443fb1e3bf6389962210affa684e5c1b521ec342f56e0901\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1703852494},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:64855c5259e69c674eb5b88ad763422f805a885f2b73638f74d6b2da5124b080\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:d31cd96af91f4826629bd4aa2b5616017608c433bbd21001049e081d29dac71f\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1238974149},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:0083802437179c49d5ca0e2c8adf92ebf380c283aa7a75464b6fb682634a5d40\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:5a47319cf339a942bc3cac0945fa2e73a0d5d04c393e0e2586a5f512b2bbbaa8\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1210619591},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:5f71bcc487d7df08613e74beacceeb04611a69281fd285bfdd5474ce1f460426\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:768d59ce045a28b596ea926e729a0acbf94cbe921ea358110ef5e6174edabbe8\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1203292349},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb
68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: E0224 10:21:52.651527 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: E0224 10:21:52.651675 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: E0224 10:21:52.651819 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: E0224 10:21:52.652029 4698 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" Feb 24 10:21:52 crc kubenswrapper[4698]: E0224 10:21:52.652051 4698 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.159539 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-p9jbm"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.159606 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p9jbm"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.227195 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p9jbm"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.228336 4698 status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.228997 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.229688 4698 status_manager.go:851] "Failed to get status for pod" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" pod="openshift-marketplace/redhat-marketplace-dh74l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dh74l\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.230126 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.230597 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.230890 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.231798 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.232202 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.232545 4698 status_manager.go:851] "Failed to get status for pod" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" pod="openshift-marketplace/community-operators-2z8bk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2z8bk\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.613975 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.615175 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.615838 4698 status_manager.go:851] "Failed to get status for pod" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" pod="openshift-marketplace/redhat-marketplace-dh74l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dh74l\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.616189 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.616804 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.617514 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.618014 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.618502 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.618924 4698 status_manager.go:851] "Failed to get status for pod" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" pod="openshift-marketplace/community-operators-2z8bk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2z8bk\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.619223 4698 status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.643947 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dh74l"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.644372 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dh74l"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.646398 4698 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="34fd32d5-5aed-4abb-bf14-ab1b8b02b516"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.646446 4698 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="34fd32d5-5aed-4abb-bf14-ab1b8b02b516"
Feb 24 10:21:53 crc kubenswrapper[4698]: E0224 10:21:53.647135 4698 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.647828 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.707115 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dh74l"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.708468 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.708825 4698 status_manager.go:851] "Failed to get status for pod" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" pod="openshift-marketplace/redhat-marketplace-dh74l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dh74l\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.709145 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.709436 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.709910 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.710513 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.710991 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.712032 4698 status_manager.go:851] "Failed to get status for pod" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" pod="openshift-marketplace/community-operators-2z8bk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2z8bk\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:53 crc kubenswrapper[4698]: I0224 10:21:53.712417 4698 status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.171012 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ppmk4"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.171126 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ppmk4"
Feb 24 10:21:54 crc kubenswrapper[4698]: E0224 10:21:54.219698 4698 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.65:6443: connect: connection refused" interval="7s"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.492406 4698 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="56e95dd119138d48f753e382d338a025cdaaa9a00d6b1858feefd5608594748a" exitCode=0
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.492535 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"56e95dd119138d48f753e382d338a025cdaaa9a00d6b1858feefd5608594748a"}
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.492595 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c618dc58deb0fe8e1a1d092643990cb8a8fabf0eab34f876fd8f919686a1579a"}
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.493989 4698 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="34fd32d5-5aed-4abb-bf14-ab1b8b02b516"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.494029 4698 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="34fd32d5-5aed-4abb-bf14-ab1b8b02b516"
Feb 24 10:21:54 crc kubenswrapper[4698]: E0224 10:21:54.494691 4698 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.494980 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.495723 4698 status_manager.go:851] "Failed to get status for pod" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" pod="openshift-marketplace/redhat-marketplace-dh74l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dh74l\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.496387 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.496874 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.497357 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.497749 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.498088 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.498585 4698 status_manager.go:851] "Failed to get status for pod" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" pod="openshift-marketplace/community-operators-2z8bk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2z8bk\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.499063 4698 status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.563591 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dh74l"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.564324 4698 status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.566225 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.567441 4698 status_manager.go:851] "Failed to get status for pod" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" pod="openshift-marketplace/redhat-marketplace-dh74l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dh74l\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.568173 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.568904 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.569510 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.570054 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.570507 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.570939 4698 status_manager.go:851] "Failed to get status for pod" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" pod="openshift-marketplace/community-operators-2z8bk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2z8bk\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.599551 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l7zvp"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.599615 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l7zvp"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.651153 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l7zvp"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.651751 4698 status_manager.go:851] "Failed to get status for pod" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" pod="openshift-marketplace/redhat-marketplace-dh74l" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-dh74l\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.652302 4698 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.652795 4698 status_manager.go:851] "Failed to get status for pod" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" pod="openshift-marketplace/community-operators-8hhkf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8hhkf\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.653110 4698 status_manager.go:851] "Failed to get status for pod" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.653455 4698 status_manager.go:851] "Failed to get status for pod" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" pod="openshift-marketplace/redhat-operators-l7zvp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-l7zvp\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.653743 4698 status_manager.go:851] "Failed to get status for pod" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" pod="openshift-marketplace/certified-operators-p5dwb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-p5dwb\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.654380 4698 status_manager.go:851] "Failed to get status for pod" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" pod="openshift-marketplace/community-operators-2z8bk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-2z8bk\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.654745 4698 status_manager.go:851] "Failed to get status for pod" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" pod="openshift-marketplace/redhat-marketplace-p9jbm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p9jbm\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:54 crc kubenswrapper[4698]: I0224 10:21:54.655018 4698 status_manager.go:851] "Failed to get status for pod" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" pod="openshift-marketplace/redhat-operators-ppmk4" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-ppmk4\": dial tcp 38.102.83.65:6443: connect: connection refused"
Feb 24 10:21:55 crc kubenswrapper[4698]: I0224 10:21:55.248085 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ppmk4" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerName="registry-server" probeResult="failure" output=<
Feb 24 10:21:55 crc kubenswrapper[4698]: timeout: failed to connect service ":50051" within 1s
Feb 24 10:21:55 crc kubenswrapper[4698]: >
Feb 24 10:21:55 crc kubenswrapper[4698]: I0224 10:21:55.500534 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7ddedd44f7a8e9171def0e7825998240059fbf485023f5b6b576e4aaa288c6b0"}
Feb 24 10:21:55 crc kubenswrapper[4698]: I0224 10:21:55.500583 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"52e04aabf802f5ae12b65e8b2e0bf6e70ef430bdb3ade989f0a504b262ac5dc4"}
Feb 24 10:21:55 crc kubenswrapper[4698]: I0224 10:21:55.500597 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"11112569f99d26d36124f5f86c67db209d96dd1a6124e19b8bc545f5e9879673"}
Feb 24 10:21:55 crc kubenswrapper[4698]: I0224 10:21:55.541091 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l7zvp"
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.163646 4698 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.163975 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.508411 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7ab75fc2a58e54b2fc52a02d9cbc72207b885f98cc34cc6e5f56ba4e841623bf"}
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.508456 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f7cadf840e794c5cd4405d2c1abfdc6a08296b1e26f4da6f19fd0511efc1b284"}
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.508564 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.508629 4698 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="34fd32d5-5aed-4abb-bf14-ab1b8b02b516"
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.508652 4698 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="34fd32d5-5aed-4abb-bf14-ab1b8b02b516"
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.511076 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.511637 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.511677 4698 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9" exitCode=1
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.511802 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9"}
Feb 24 10:21:56 crc kubenswrapper[4698]: I0224 10:21:56.512237 4698 scope.go:117] "RemoveContainer" containerID="e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9"
Feb 24 10:21:57 crc kubenswrapper[4698]: I0224 10:21:57.521872 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 24 10:21:57 crc kubenswrapper[4698]: I0224 10:21:57.522637 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 24 10:21:57 crc kubenswrapper[4698]: I0224 10:21:57.522677 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ad130d0c31b31375e460ffddd5711a065eaaa0c06c4ef3d80a8bcf4702263046"}
Feb 24 10:21:58 crc kubenswrapper[4698]: I0224 10:21:58.648381 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:21:58 crc kubenswrapper[4698]: I0224 10:21:58.648704 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:21:58 crc kubenswrapper[4698]: I0224 10:21:58.655715 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:22:01 crc kubenswrapper[4698]: I0224 10:22:01.217039 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-p5dwb"
Feb 24 10:22:01 crc kubenswrapper[4698]: I0224 10:22:01.439564 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2z8bk"
Feb 24 10:22:01 crc kubenswrapper[4698]: I0224 10:22:01.439620 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2z8bk"
Feb 24 10:22:01 crc kubenswrapper[4698]: I0224 10:22:01.482780 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2z8bk"
Feb 24 10:22:01 crc kubenswrapper[4698]: I0224 10:22:01.520208 4698 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:22:01 crc kubenswrapper[4698]: I0224 10:22:01.546452 4698 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="34fd32d5-5aed-4abb-bf14-ab1b8b02b516"
Feb 24 10:22:01 crc kubenswrapper[4698]: I0224 10:22:01.546501 4698 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="34fd32d5-5aed-4abb-bf14-ab1b8b02b516"
Feb 24 10:22:01 crc kubenswrapper[4698]: I0224 10:22:01.551365 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 24 10:22:01 crc kubenswrapper[4698]: I0224 10:22:01.554216 4698 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9e76a154-bd68-48a7-8766-30073fa319df"
Feb 24 10:22:01 crc kubenswrapper[4698]: I0224 10:22:01.586083 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2z8bk"
Feb 24 10:22:02 crc kubenswrapper[4698]: I0224 10:22:02.551552 4698 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="34fd32d5-5aed-4abb-bf14-ab1b8b02b516"
Feb 24 10:22:02 crc kubenswrapper[4698]: I0224 10:22:02.551587 4698 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="34fd32d5-5aed-4abb-bf14-ab1b8b02b516"
Feb 24 10:22:02 crc kubenswrapper[4698]: I0224 10:22:02.653090 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6"
Feb 24 10:22:02 crc kubenswrapper[4698]: I0224 10:22:02.653756 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6"
Feb 24 10:22:03 crc kubenswrapper[4698]: W0224 10:22:03.106309 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod101b40a6_d373_47b1_83f5_b5bf8bd579c8.slice/crio-93a4a205a6550ca196cdbfb82c857cb9872fe345e65955afd7705ae159354271 WatchSource:0}: Error finding container 93a4a205a6550ca196cdbfb82c857cb9872fe345e65955afd7705ae159354271: Status 404 returned error can't find the container with id 93a4a205a6550ca196cdbfb82c857cb9872fe345e65955afd7705ae159354271
Feb 24 10:22:03 crc kubenswrapper[4698]: I0224 10:22:03.226978 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p9jbm"
Feb 24 10:22:03 crc kubenswrapper[4698]: I0224 10:22:03.558803 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" event={"ID":"101b40a6-d373-47b1-83f5-b5bf8bd579c8","Type":"ContainerStarted","Data":"2c1907d8cbed72fda1c1b6eb1876b793cc232f9ae5d9a1a1d1146c8ac68bcc9e"}
Feb 24 10:22:03 crc kubenswrapper[4698]: I0224 10:22:03.558852 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" event={"ID":"101b40a6-d373-47b1-83f5-b5bf8bd579c8","Type":"ContainerStarted","Data":"93a4a205a6550ca196cdbfb82c857cb9872fe345e65955afd7705ae159354271"}
Feb 24 10:22:03 crc kubenswrapper[4698]: I0224 10:22:03.559281 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6"
Feb 24 10:22:03 crc kubenswrapper[4698]: I0224 10:22:03.614236 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr"
Feb 24 10:22:03 crc kubenswrapper[4698]: I0224 10:22:03.614673 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr"
Feb 24 10:22:03 crc kubenswrapper[4698]: W0224 10:22:03.870496 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode266bb2f_40eb_4da2_9767_0a300c8dc27b.slice/crio-95c7b5dc8f8dad85fb79a0f538f0a865d9eb63f252754474cf9955cba0a04061 WatchSource:0}: Error finding container 95c7b5dc8f8dad85fb79a0f538f0a865d9eb63f252754474cf9955cba0a04061: Status 404 returned error can't find the container with id 95c7b5dc8f8dad85fb79a0f538f0a865d9eb63f252754474cf9955cba0a04061
Feb 24 10:22:04 crc kubenswrapper[4698]: I0224 10:22:04.213699 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ppmk4"
Feb 24 10:22:04 crc kubenswrapper[4698]: I0224 10:22:04.281459 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ppmk4"
Feb 24 10:22:04 crc kubenswrapper[4698]: I0224 10:22:04.559497 4698 patch_prober.go:28] interesting pod/route-controller-manager-695f7b9b5-pqzw6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Feb 24 10:22:04 crc kubenswrapper[4698]: I0224 10:22:04.559604 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 24 10:22:04 crc kubenswrapper[4698]: I0224 10:22:04.567871 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" event={"ID":"e266bb2f-40eb-4da2-9767-0a300c8dc27b","Type":"ContainerStarted","Data":"ae84a11824584df4e5d291b39c126f5906f6363108f41233ce773636ee70284e"}
Feb 24 10:22:04 crc kubenswrapper[4698]: I0224 10:22:04.567960 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" event={"ID":"e266bb2f-40eb-4da2-9767-0a300c8dc27b","Type":"ContainerStarted","Data":"95c7b5dc8f8dad85fb79a0f538f0a865d9eb63f252754474cf9955cba0a04061"}
Feb 24 10:22:04 crc kubenswrapper[4698]: I0224 10:22:04.568721 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr"
Feb 24 10:22:04 crc kubenswrapper[4698]: I0224 10:22:04.575425 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr"
Feb 24 10:22:05 crc kubenswrapper[4698]: I0224 10:22:05.106769 4698 kubelet.go:2542] "SyncLoop (probe)"
probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:22:05 crc kubenswrapper[4698]: I0224 10:22:05.106941 4698 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 24 10:22:05 crc kubenswrapper[4698]: I0224 10:22:05.106983 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 24 10:22:05 crc kubenswrapper[4698]: I0224 10:22:05.568176 4698 patch_prober.go:28] interesting pod/route-controller-manager-695f7b9b5-pqzw6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:22:05 crc kubenswrapper[4698]: I0224 10:22:05.568856 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 10:22:05 crc kubenswrapper[4698]: I0224 10:22:05.670180 4698 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="9e76a154-bd68-48a7-8766-30073fa319df" Feb 24 10:22:06 crc kubenswrapper[4698]: I0224 10:22:06.163342 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:22:06 crc kubenswrapper[4698]: I0224 10:22:06.571364 4698 patch_prober.go:28] interesting pod/route-controller-manager-695f7b9b5-pqzw6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:22:06 crc kubenswrapper[4698]: I0224 10:22:06.571450 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 24 10:22:11 crc kubenswrapper[4698]: I0224 10:22:11.356031 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 24 10:22:11 crc kubenswrapper[4698]: I0224 10:22:11.357577 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 24 10:22:12 crc kubenswrapper[4698]: I0224 10:22:12.160832 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 24 10:22:12 crc kubenswrapper[4698]: I0224 10:22:12.189859 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 24 10:22:12 crc kubenswrapper[4698]: I0224 10:22:12.250772 4698 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 24 10:22:12 crc kubenswrapper[4698]: I0224 10:22:12.308963 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 24 10:22:12 crc kubenswrapper[4698]: I0224 10:22:12.349200 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 24 10:22:12 crc kubenswrapper[4698]: I0224 10:22:12.386746 4698 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 24 10:22:12 crc kubenswrapper[4698]: I0224 10:22:12.500329 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 24 10:22:12 crc kubenswrapper[4698]: I0224 10:22:12.873772 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 24 10:22:12 crc kubenswrapper[4698]: I0224 10:22:12.958977 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 24 10:22:12 crc kubenswrapper[4698]: I0224 10:22:12.972005 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.265677 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.364540 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.566794 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.581796 4698 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.617872 4698 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.618105 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dh74l" podStartSLOduration=34.956616035 podStartE2EDuration="2m0.618049097s" podCreationTimestamp="2026-02-24 10:20:13 +0000 UTC" firstStartedPulling="2026-02-24 10:20:17.870375166 +0000 UTC m=+242.983989407" lastFinishedPulling="2026-02-24 10:21:43.531808198 +0000 UTC m=+328.645422469" observedRunningTime="2026-02-24 10:22:01.2756211 +0000 UTC m=+346.389235351" watchObservedRunningTime="2026-02-24 10:22:13.618049097 +0000 UTC m=+358.731663398" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.618488 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l7zvp" podStartSLOduration=44.919293529 podStartE2EDuration="1m59.618468507s" podCreationTimestamp="2026-02-24 10:20:14 +0000 UTC" firstStartedPulling="2026-02-24 10:20:18.873325658 +0000 UTC m=+243.986939889" lastFinishedPulling="2026-02-24 10:21:33.572500586 +0000 UTC m=+318.686114867" observedRunningTime="2026-02-24 10:22:01.33224692 +0000 UTC m=+346.445861161" watchObservedRunningTime="2026-02-24 10:22:13.618468507 +0000 UTC m=+358.732082838" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.619075 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-p5dwb" podStartSLOduration=27.550423323 podStartE2EDuration="2m3.619057371s" podCreationTimestamp="2026-02-24 10:20:10 +0000 UTC" firstStartedPulling="2026-02-24 10:20:11.61657035 +0000 UTC m=+236.730184591" lastFinishedPulling="2026-02-24 10:21:47.685204358 +0000 UTC m=+332.798818639" 
observedRunningTime="2026-02-24 10:22:01.346005328 +0000 UTC m=+346.459619569" watchObservedRunningTime="2026-02-24 10:22:13.619057371 +0000 UTC m=+358.732671672" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.619940 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ppmk4" podStartSLOduration=36.300541049 podStartE2EDuration="2m0.619924072s" podCreationTimestamp="2026-02-24 10:20:13 +0000 UTC" firstStartedPulling="2026-02-24 10:20:18.873101262 +0000 UTC m=+243.986715503" lastFinishedPulling="2026-02-24 10:21:43.192484285 +0000 UTC m=+328.306098526" observedRunningTime="2026-02-24 10:22:01.250174884 +0000 UTC m=+346.363789145" watchObservedRunningTime="2026-02-24 10:22:13.619924072 +0000 UTC m=+358.733538353" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.621544 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p9jbm" podStartSLOduration=30.524019999 podStartE2EDuration="2m1.62153274s" podCreationTimestamp="2026-02-24 10:20:12 +0000 UTC" firstStartedPulling="2026-02-24 10:20:14.791825011 +0000 UTC m=+239.905439252" lastFinishedPulling="2026-02-24 10:21:45.889337752 +0000 UTC m=+331.002951993" observedRunningTime="2026-02-24 10:22:01.372814716 +0000 UTC m=+346.486428957" watchObservedRunningTime="2026-02-24 10:22:13.62153274 +0000 UTC m=+358.735146981" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.623303 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2z8bk" podStartSLOduration=29.41323094 podStartE2EDuration="2m2.623294882s" podCreationTimestamp="2026-02-24 10:20:11 +0000 UTC" firstStartedPulling="2026-02-24 10:20:12.679366432 +0000 UTC m=+237.792980673" lastFinishedPulling="2026-02-24 10:21:45.889430354 +0000 UTC m=+331.003044615" observedRunningTime="2026-02-24 10:22:01.358900304 +0000 UTC m=+346.472514565" 
watchObservedRunningTime="2026-02-24 10:22:13.623294882 +0000 UTC m=+358.736909123" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.623565 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8hhkf" podStartSLOduration=34.155777776 podStartE2EDuration="2m3.623558529s" podCreationTimestamp="2026-02-24 10:20:10 +0000 UTC" firstStartedPulling="2026-02-24 10:20:11.635306245 +0000 UTC m=+236.748920486" lastFinishedPulling="2026-02-24 10:21:41.103086988 +0000 UTC m=+326.216701239" observedRunningTime="2026-02-24 10:22:01.305239667 +0000 UTC m=+346.418853908" watchObservedRunningTime="2026-02-24 10:22:13.623558529 +0000 UTC m=+358.737172770" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.629412 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=33.629376987 podStartE2EDuration="33.629376987s" podCreationTimestamp="2026-02-24 10:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:22:01.285137377 +0000 UTC m=+346.398751618" watchObservedRunningTime="2026-02-24 10:22:13.629376987 +0000 UTC m=+358.742991318" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.631253 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" podStartSLOduration=44.631229261 podStartE2EDuration="44.631229261s" podCreationTimestamp="2026-02-24 10:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:22:03.572837499 +0000 UTC m=+348.686451780" watchObservedRunningTime="2026-02-24 10:22:13.631229261 +0000 UTC m=+358.744843502" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.631629 4698 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" podStartSLOduration=44.63162382 podStartE2EDuration="44.63162382s" podCreationTimestamp="2026-02-24 10:21:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:22:04.587650431 +0000 UTC m=+349.701264662" watchObservedRunningTime="2026-02-24 10:22:13.63162382 +0000 UTC m=+358.745238061" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.633428 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.633469 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.633487 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6","openshift-controller-manager/controller-manager-85cb469ccf-nndrr"] Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.655499 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=12.655483218 podStartE2EDuration="12.655483218s" podCreationTimestamp="2026-02-24 10:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:22:13.651486414 +0000 UTC m=+358.765100655" watchObservedRunningTime="2026-02-24 10:22:13.655483218 +0000 UTC m=+358.769097459" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.657116 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 24 10:22:13 crc kubenswrapper[4698]: I0224 10:22:13.807747 4698 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 24 10:22:14 crc kubenswrapper[4698]: I0224 10:22:14.082985 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 24 10:22:14 crc kubenswrapper[4698]: I0224 10:22:14.223093 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 24 10:22:14 crc kubenswrapper[4698]: I0224 10:22:14.471886 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 24 10:22:14 crc kubenswrapper[4698]: I0224 10:22:14.634735 4698 patch_prober.go:28] interesting pod/route-controller-manager-695f7b9b5-pqzw6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:22:14 crc kubenswrapper[4698]: I0224 10:22:14.634816 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 10:22:14 crc kubenswrapper[4698]: I0224 10:22:14.664000 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 24 10:22:14 crc kubenswrapper[4698]: I0224 10:22:14.922222 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 
10:22:15.107556 4698 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.107633 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.513570 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.695433 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.708877 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.711691 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.744568 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.759335 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.882175 4698 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-oauth-config" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.901422 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.929491 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.935650 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 24 10:22:15 crc kubenswrapper[4698]: I0224 10:22:15.972029 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.056385 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.092801 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.166361 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.175985 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.188152 4698 patch_prober.go:28] interesting pod/route-controller-manager-695f7b9b5-pqzw6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 24 10:22:16 crc kubenswrapper[4698]: 
I0224 10:22:16.188208 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.195550 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.248026 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.292320 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.335968 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.376720 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.471146 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.566519 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.735345 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.785613 4698 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.869992 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.891223 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 24 10:22:16 crc kubenswrapper[4698]: I0224 10:22:16.982444 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.000281 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.052071 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.151318 4698 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.357250 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.516488 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.520044 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.548976 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 
10:22:17.697907 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.730697 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.745811 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.776518 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.795922 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.867877 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.908062 4698 scope.go:117] "RemoveContainer" containerID="6b9d9ca2f4ccd094b55e3e27cef8afddae5dc7de81912aba64ca6a6671f14a35" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.912454 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.932349 4698 scope.go:117] "RemoveContainer" containerID="c9e1b116db9c76dec99d1ac4af98e5ee081f2a171a19093ba5628b676356f34b" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.959411 4698 scope.go:117] "RemoveContainer" containerID="7e1bb75600de7e41c8a04ba010078c753b55d05aae7a18f945c2027ba48ee30c" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.961347 4698 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"kube-root-ca.crt" Feb 24 10:22:17 crc kubenswrapper[4698]: I0224 10:22:17.985977 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.040630 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.128655 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.301978 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.305099 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.347500 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.486447 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.527738 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.542488 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.553398 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.564547 4698 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.617391 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.661385 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.696043 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.697930 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.781224 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.789926 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.796592 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.809395 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.930577 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Feb 24 10:22:18 crc kubenswrapper[4698]: I0224 10:22:18.970074 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.158719 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.172170 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.193792 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.257073 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.298391 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.300789 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.350913 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.372313 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.438495 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.441148 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.443979 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.536765 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.556094 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.593930 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.608534 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.622949 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.649821 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.684050 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.693045 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.776766 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.811314 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.859380 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.951859 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.954845 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.978196 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 24 10:22:19 crc kubenswrapper[4698]: I0224 10:22:19.978227 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.036378 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.102443 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.185519 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.238532 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.299408 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.334687 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.347867 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.348704 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.384078 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.399047 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.412217 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.440477 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.540666 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.599376 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.606077 4698 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.652344 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.663019 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.744254 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.764582 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 24 10:22:20 crc kubenswrapper[4698]: I0224 10:22:20.847669 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.002643 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.004860 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.070582 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.189666 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.194378 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.205354 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.209565 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.243606 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.361893 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.379164 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.388216 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.389870 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.398529 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.467666 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.484167 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.491052 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.518995 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.570095 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.577216 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.605517 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.765437 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.818598 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.895998 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 24 10:22:21 crc kubenswrapper[4698]: I0224 10:22:21.939541 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.031024 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.038114 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.126204 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.158630 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.270783 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.313103 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.320825 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.330149 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.396002 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.410229 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.464536 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.491950 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.547068 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56496: no serving certificate available for the kubelet"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.576866 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.592988 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.757610 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.789410 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.802846 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.895116 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 24 10:22:22 crc kubenswrapper[4698]: I0224 10:22:22.940350 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.016383 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.065556 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.148018 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.254331 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.277416 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.308794 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.379073 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.434152 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.435631 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.436236 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.463350 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.504000 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.601595 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.898172 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 24 10:22:23 crc kubenswrapper[4698]: I0224 10:22:23.992695 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.008043 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.043725 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.045897 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.054826 4698 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.055045 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3" gracePeriod=5
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.147164 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.192509 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.266251 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.284054 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.367762 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.478671 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.486711 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.646118 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.665852 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.713346 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.769121 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.772037 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.930085 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 24 10:22:24 crc kubenswrapper[4698]: I0224 10:22:24.987673 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.012897 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.107666 4698 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body=
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.107755 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.107826 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.108566 4698 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"ad130d0c31b31375e460ffddd5711a065eaaa0c06c4ef3d80a8bcf4702263046"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.108713 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://ad130d0c31b31375e460ffddd5711a065eaaa0c06c4ef3d80a8bcf4702263046" gracePeriod=30
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.195528 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.283102 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.338037 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.448773 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.595973 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.716136 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.725531 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.778813 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.783779 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.819160 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.870293 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 24 10:22:25 crc kubenswrapper[4698]: I0224 10:22:25.938972 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.091658 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.123811 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.214029 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.232525 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.349893 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.402597 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.516366 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.530557 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.617058 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.675593 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.776871 4698 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.809063 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.844051 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.861982 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 24 10:22:26 crc kubenswrapper[4698]: I0224 10:22:26.904950 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 24 10:22:27 crc kubenswrapper[4698]: I0224 10:22:27.027601 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 24 10:22:27 crc kubenswrapper[4698]: I0224 10:22:27.029616 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 24 10:22:27 crc kubenswrapper[4698]: I0224 10:22:27.450520 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 24 10:22:27 crc kubenswrapper[4698]: I0224 10:22:27.543068 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 24 10:22:27 crc kubenswrapper[4698]: I0224 10:22:27.562578 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 24 10:22:27 crc kubenswrapper[4698]: I0224 10:22:27.591357 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 24 10:22:27 crc kubenswrapper[4698]: I0224 10:22:27.719877 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Feb 24 10:22:28 crc kubenswrapper[4698]: I0224 10:22:28.008327 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 24 10:22:28 crc kubenswrapper[4698]: I0224 10:22:28.039666 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 24 10:22:28 crc kubenswrapper[4698]: I0224 10:22:28.301007 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 24 10:22:28 crc kubenswrapper[4698]: I0224 10:22:28.521974 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 24 10:22:28 crc kubenswrapper[4698]: I0224 10:22:28.583689 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 24 10:22:28 crc kubenswrapper[4698]: I0224 10:22:28.638422 4698 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Feb 24 10:22:28 crc kubenswrapper[4698]: I0224 10:22:28.834937 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.453277 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.657250 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.723710 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.723777 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.731321 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56506: no serving certificate available for the kubelet"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.739779 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.739943 4698 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3" exitCode=137
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.739998 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.740003 4698 scope.go:117] "RemoveContainer" containerID="8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.754531 4698 scope.go:117] "RemoveContainer" containerID="8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3"
Feb 24 10:22:29 crc kubenswrapper[4698]: E0224 10:22:29.754848 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3\": container with ID starting with 8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3 not found: ID does not exist" containerID="8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.754875 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3"} err="failed to get container status \"8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3\": rpc error: code = NotFound desc = could not find container \"8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3\": container with ID starting with 8d93246e265f6291bed0b4633ebbd0554c888cc7bada79470ade0d28ef35c2f3 not found: ID does not exist"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.758743 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.827746 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\"
(UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.827813 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.827984 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.828041 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.828050 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.828083 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.828097 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.828117 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.828229 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.828484 4698 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.828510 4698 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.828529 4698 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.828545 4698 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.835885 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:22:29 crc kubenswrapper[4698]: I0224 10:22:29.929534 4698 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:30 crc kubenswrapper[4698]: I0224 10:22:30.643043 4698 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 24 10:22:30 crc kubenswrapper[4698]: I0224 10:22:30.979961 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 24 10:22:31 crc kubenswrapper[4698]: I0224 10:22:31.626474 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 24 10:22:31 crc kubenswrapper[4698]: I0224 10:22:31.628516 4698 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Feb 24 10:22:31 crc kubenswrapper[4698]: I0224 10:22:31.643914 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 24 10:22:31 crc kubenswrapper[4698]: I0224 10:22:31.643972 4698 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="63fb9a17-2692-45c3-8332-362dda9d75a2" Feb 24 10:22:31 crc kubenswrapper[4698]: I0224 10:22:31.650801 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 24 10:22:31 crc kubenswrapper[4698]: I0224 10:22:31.650859 4698 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="63fb9a17-2692-45c3-8332-362dda9d75a2" Feb 24 10:22:43 
crc kubenswrapper[4698]: I0224 10:22:43.305696 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5dwb"] Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.307448 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-p5dwb" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" containerName="registry-server" containerID="cri-o://7f21bc1060193fd3dc0a8ba603e960a96ad825da9a8bc7332c1b998c3f99b59b" gracePeriod=30 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.315131 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2z8bk"] Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.315382 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2z8bk" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" containerName="registry-server" containerID="cri-o://12504475d7d48a69d790441d24d5db1b06a5fc56a42debad4e922a6ebf5b8f0c" gracePeriod=30 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.323927 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hhkf"] Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.324154 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8hhkf" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" containerName="registry-server" containerID="cri-o://bfb3f47307f92e49a3096d724083678c00dbda225ab4067de2614a5c8f4a0a9f" gracePeriod=30 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.332539 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-79f62"] Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.332729 4698 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/marketplace-operator-79b997595-79f62" podUID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" containerName="marketplace-operator" containerID="cri-o://186142add921ff86958b67725d353a8e19fd4648ca6aa0e8301a4a1c96f0a1cd" gracePeriod=30 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.341522 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dh74l"] Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.341748 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dh74l" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerName="registry-server" containerID="cri-o://25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c" gracePeriod=30 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.349487 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9jbm"] Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.349721 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p9jbm" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" containerName="registry-server" containerID="cri-o://d27747ceecec130db4ae589810d43e4170d1292ad502837658213b005fb212a6" gracePeriod=30 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.355874 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7zvp"] Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.356142 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l7zvp" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" containerName="registry-server" containerID="cri-o://65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56" gracePeriod=30 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.358905 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-ppmk4"] Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.359169 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ppmk4" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerName="registry-server" containerID="cri-o://3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19" gracePeriod=30 Feb 24 10:22:43 crc kubenswrapper[4698]: E0224 10:22:43.643461 4698 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c is running failed: container process not found" containerID="25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c" cmd=["grpc_health_probe","-addr=:50051"] Feb 24 10:22:43 crc kubenswrapper[4698]: E0224 10:22:43.643860 4698 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c is running failed: container process not found" containerID="25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c" cmd=["grpc_health_probe","-addr=:50051"] Feb 24 10:22:43 crc kubenswrapper[4698]: E0224 10:22:43.644102 4698 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c is running failed: container process not found" containerID="25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c" cmd=["grpc_health_probe","-addr=:50051"] Feb 24 10:22:43 crc kubenswrapper[4698]: E0224 10:22:43.644167 4698 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-dh74l" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerName="registry-server" Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.691985 4698 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-79f62 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" start-of-body= Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.692043 4698 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" podUID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.25:8080/healthz\": dial tcp 10.217.0.25:8080: connect: connection refused" Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.829959 4698 generic.go:334] "Generic (PLEG): container finished" podID="2eee2a16-171b-402e-9549-3d14cb56cddc" containerID="bfb3f47307f92e49a3096d724083678c00dbda225ab4067de2614a5c8f4a0a9f" exitCode=0 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.830022 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hhkf" event={"ID":"2eee2a16-171b-402e-9549-3d14cb56cddc","Type":"ContainerDied","Data":"bfb3f47307f92e49a3096d724083678c00dbda225ab4067de2614a5c8f4a0a9f"} Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.832624 4698 generic.go:334] "Generic (PLEG): container finished" podID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" containerID="7f21bc1060193fd3dc0a8ba603e960a96ad825da9a8bc7332c1b998c3f99b59b" exitCode=0 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.832682 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-p5dwb" event={"ID":"19022af1-394c-4aab-9eb1-ffb0f566d0ac","Type":"ContainerDied","Data":"7f21bc1060193fd3dc0a8ba603e960a96ad825da9a8bc7332c1b998c3f99b59b"} Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.835062 4698 generic.go:334] "Generic (PLEG): container finished" podID="291ba94f-a9ac-4d5c-8476-221496078d80" containerID="65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56" exitCode=0 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.835131 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7zvp" event={"ID":"291ba94f-a9ac-4d5c-8476-221496078d80","Type":"ContainerDied","Data":"65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56"} Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.838621 4698 generic.go:334] "Generic (PLEG): container finished" podID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerID="25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c" exitCode=0 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.838686 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh74l" event={"ID":"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a","Type":"ContainerDied","Data":"25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c"} Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.840513 4698 generic.go:334] "Generic (PLEG): container finished" podID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" containerID="186142add921ff86958b67725d353a8e19fd4648ca6aa0e8301a4a1c96f0a1cd" exitCode=0 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.840574 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" event={"ID":"d0de08e0-63c0-4a90-a264-1bc41b8746d8","Type":"ContainerDied","Data":"186142add921ff86958b67725d353a8e19fd4648ca6aa0e8301a4a1c96f0a1cd"} Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 
10:22:43.842574 4698 generic.go:334] "Generic (PLEG): container finished" podID="5149fd4f-19d7-4852-b09a-d9909b8231dd" containerID="12504475d7d48a69d790441d24d5db1b06a5fc56a42debad4e922a6ebf5b8f0c" exitCode=0 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.842637 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8bk" event={"ID":"5149fd4f-19d7-4852-b09a-d9909b8231dd","Type":"ContainerDied","Data":"12504475d7d48a69d790441d24d5db1b06a5fc56a42debad4e922a6ebf5b8f0c"} Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.844599 4698 generic.go:334] "Generic (PLEG): container finished" podID="25f4eaf1-6171-44dd-b225-be712a45ba1b" containerID="d27747ceecec130db4ae589810d43e4170d1292ad502837658213b005fb212a6" exitCode=0 Feb 24 10:22:43 crc kubenswrapper[4698]: I0224 10:22:43.844622 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9jbm" event={"ID":"25f4eaf1-6171-44dd-b225-be712a45ba1b","Type":"ContainerDied","Data":"d27747ceecec130db4ae589810d43e4170d1292ad502837658213b005fb212a6"} Feb 24 10:22:44 crc kubenswrapper[4698]: E0224 10:22:44.171501 4698 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19 is running failed: container process not found" containerID="3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19" cmd=["grpc_health_probe","-addr=:50051"] Feb 24 10:22:44 crc kubenswrapper[4698]: E0224 10:22:44.171850 4698 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19 is running failed: container process not found" containerID="3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19" 
cmd=["grpc_health_probe","-addr=:50051"] Feb 24 10:22:44 crc kubenswrapper[4698]: E0224 10:22:44.172137 4698 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19 is running failed: container process not found" containerID="3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19" cmd=["grpc_health_probe","-addr=:50051"] Feb 24 10:22:44 crc kubenswrapper[4698]: E0224 10:22:44.172171 4698 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-ppmk4" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerName="registry-server" Feb 24 10:22:44 crc kubenswrapper[4698]: E0224 10:22:44.600157 4698 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56 is running failed: container process not found" containerID="65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56" cmd=["grpc_health_probe","-addr=:50051"] Feb 24 10:22:44 crc kubenswrapper[4698]: E0224 10:22:44.600644 4698 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56 is running failed: container process not found" containerID="65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56" cmd=["grpc_health_probe","-addr=:50051"] Feb 24 10:22:44 crc kubenswrapper[4698]: E0224 10:22:44.600935 4698 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
NotFound desc = container is not created or running: checking if PID of 65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56 is running failed: container process not found" containerID="65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56" cmd=["grpc_health_probe","-addr=:50051"] Feb 24 10:22:44 crc kubenswrapper[4698]: E0224 10:22:44.600966 4698 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-l7zvp" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" containerName="registry-server" Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.772515 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.842233 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-catalog-content\") pod \"291ba94f-a9ac-4d5c-8476-221496078d80\" (UID: \"291ba94f-a9ac-4d5c-8476-221496078d80\") " Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.842343 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn7xk\" (UniqueName: \"kubernetes.io/projected/291ba94f-a9ac-4d5c-8476-221496078d80-kube-api-access-xn7xk\") pod \"291ba94f-a9ac-4d5c-8476-221496078d80\" (UID: \"291ba94f-a9ac-4d5c-8476-221496078d80\") " Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.842387 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-utilities\") pod \"291ba94f-a9ac-4d5c-8476-221496078d80\" (UID: 
\"291ba94f-a9ac-4d5c-8476-221496078d80\") " Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.843185 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-utilities" (OuterVolumeSpecName: "utilities") pod "291ba94f-a9ac-4d5c-8476-221496078d80" (UID: "291ba94f-a9ac-4d5c-8476-221496078d80"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.850385 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/291ba94f-a9ac-4d5c-8476-221496078d80-kube-api-access-xn7xk" (OuterVolumeSpecName: "kube-api-access-xn7xk") pod "291ba94f-a9ac-4d5c-8476-221496078d80" (UID: "291ba94f-a9ac-4d5c-8476-221496078d80"). InnerVolumeSpecName "kube-api-access-xn7xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.860543 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l7zvp" event={"ID":"291ba94f-a9ac-4d5c-8476-221496078d80","Type":"ContainerDied","Data":"d45dbe795bfedd1e2eeb45e669c0614087eac6502f1b8e64c74ed3c71c9ed10a"} Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.860596 4698 scope.go:117] "RemoveContainer" containerID="65da67d67e3e604c84ef87c76aa8af86c17ca0c8c931acd6cf9fca161bdf1d56" Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.860724 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l7zvp" Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.865216 4698 generic.go:334] "Generic (PLEG): container finished" podID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerID="3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19" exitCode=0 Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.865252 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppmk4" event={"ID":"55418747-8c79-496a-9b89-68f9eaa3f01a","Type":"ContainerDied","Data":"3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19"} Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.917986 4698 scope.go:117] "RemoveContainer" containerID="47883b80e42c225646e36edfeb6751002aed2fe48b26c67e1113ba52f3dc7715" Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.943854 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.943881 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn7xk\" (UniqueName: \"kubernetes.io/projected/291ba94f-a9ac-4d5c-8476-221496078d80-kube-api-access-xn7xk\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:44 crc kubenswrapper[4698]: I0224 10:22:44.947442 4698 scope.go:117] "RemoveContainer" containerID="6a2b14809b5503e2af70b5402fd1ed78253f7bc05b307b8c2d9edae83978c36a" Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.073616 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "291ba94f-a9ac-4d5c-8476-221496078d80" (UID: "291ba94f-a9ac-4d5c-8476-221496078d80"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.088585 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9jbm"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.121761 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2z8bk"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.147181 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-catalog-content\") pod \"5149fd4f-19d7-4852-b09a-d9909b8231dd\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.147283 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w26kv\" (UniqueName: \"kubernetes.io/projected/5149fd4f-19d7-4852-b09a-d9909b8231dd-kube-api-access-w26kv\") pod \"5149fd4f-19d7-4852-b09a-d9909b8231dd\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.147349 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq2ws\" (UniqueName: \"kubernetes.io/projected/25f4eaf1-6171-44dd-b225-be712a45ba1b-kube-api-access-mq2ws\") pod \"25f4eaf1-6171-44dd-b225-be712a45ba1b\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.147434 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-catalog-content\") pod \"25f4eaf1-6171-44dd-b225-be712a45ba1b\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.147463 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-utilities\") pod \"25f4eaf1-6171-44dd-b225-be712a45ba1b\" (UID: \"25f4eaf1-6171-44dd-b225-be712a45ba1b\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.147480 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-utilities\") pod \"5149fd4f-19d7-4852-b09a-d9909b8231dd\" (UID: \"5149fd4f-19d7-4852-b09a-d9909b8231dd\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.147650 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/291ba94f-a9ac-4d5c-8476-221496078d80-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.148344 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-utilities" (OuterVolumeSpecName: "utilities") pod "5149fd4f-19d7-4852-b09a-d9909b8231dd" (UID: "5149fd4f-19d7-4852-b09a-d9909b8231dd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.160973 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-utilities" (OuterVolumeSpecName: "utilities") pod "25f4eaf1-6171-44dd-b225-be712a45ba1b" (UID: "25f4eaf1-6171-44dd-b225-be712a45ba1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.189617 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f4eaf1-6171-44dd-b225-be712a45ba1b-kube-api-access-mq2ws" (OuterVolumeSpecName: "kube-api-access-mq2ws") pod "25f4eaf1-6171-44dd-b225-be712a45ba1b" (UID: "25f4eaf1-6171-44dd-b225-be712a45ba1b"). InnerVolumeSpecName "kube-api-access-mq2ws". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.210797 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5149fd4f-19d7-4852-b09a-d9909b8231dd-kube-api-access-w26kv" (OuterVolumeSpecName: "kube-api-access-w26kv") pod "5149fd4f-19d7-4852-b09a-d9909b8231dd" (UID: "5149fd4f-19d7-4852-b09a-d9909b8231dd"). InnerVolumeSpecName "kube-api-access-w26kv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.212302 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "25f4eaf1-6171-44dd-b225-be712a45ba1b" (UID: "25f4eaf1-6171-44dd-b225-be712a45ba1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.250232 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l7zvp"]
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.254011 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.254102 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/25f4eaf1-6171-44dd-b225-be712a45ba1b-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.254112 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.254121 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w26kv\" (UniqueName: \"kubernetes.io/projected/5149fd4f-19d7-4852-b09a-d9909b8231dd-kube-api-access-w26kv\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.254130 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq2ws\" (UniqueName: \"kubernetes.io/projected/25f4eaf1-6171-44dd-b225-be712a45ba1b-kube-api-access-mq2ws\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.260208 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l7zvp"]
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.280782 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppmk4"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.289339 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zdlkh"]
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.289812 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" containerName="extract-content"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.289913 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" containerName="extract-content"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.289998 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.290072 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.290137 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" containerName="extract-utilities"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.290204 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" containerName="extract-utilities"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.290303 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.290429 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.290509 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.290565 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.290618 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" containerName="extract-utilities"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.290677 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" containerName="extract-utilities"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.290732 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" containerName="extract-utilities"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.290788 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" containerName="extract-utilities"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.290843 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" containerName="extract-content"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.290895 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" containerName="extract-content"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.290965 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" containerName="extract-content"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.291028 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" containerName="extract-content"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.291087 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" containerName="installer"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.291153 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" containerName="installer"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.291233 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.291312 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.291371 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.291431 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.291504 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerName="extract-content"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.291588 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerName="extract-content"
Feb 24 10:22:45 crc kubenswrapper[4698]: E0224 10:22:45.291667 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerName="extract-utilities"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.291736 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerName="extract-utilities"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.291936 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="d03f18c4-57c2-4d45-9b43-0b4fbf8f4a41" containerName="installer"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.292025 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.292097 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.292795 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.292897 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" containerName="registry-server"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.292975 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.293548 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.307622 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5149fd4f-19d7-4852-b09a-d9909b8231dd" (UID: "5149fd4f-19d7-4852-b09a-d9909b8231dd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.309776 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-85cb469ccf-nndrr"]
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.310029 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" podUID="e266bb2f-40eb-4da2-9767-0a300c8dc27b" containerName="controller-manager" containerID="cri-o://ae84a11824584df4e5d291b39c126f5906f6363108f41233ce773636ee70284e" gracePeriod=30
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.314664 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6"]
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.316306 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" containerName="route-controller-manager" containerID="cri-o://2c1907d8cbed72fda1c1b6eb1876b793cc232f9ae5d9a1a1d1146c8ac68bcc9e" gracePeriod=30
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.323063 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hhkf"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.333047 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dh74l"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.334640 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-79f62"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.334994 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5dwb"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.337966 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zdlkh"]
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.354723 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-utilities\") pod \"2eee2a16-171b-402e-9549-3d14cb56cddc\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.354911 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-catalog-content\") pod \"2eee2a16-171b-402e-9549-3d14cb56cddc\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.354985 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-catalog-content\") pod \"55418747-8c79-496a-9b89-68f9eaa3f01a\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.355099 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz7zw\" (UniqueName: \"kubernetes.io/projected/2eee2a16-171b-402e-9549-3d14cb56cddc-kube-api-access-zz7zw\") pod \"2eee2a16-171b-402e-9549-3d14cb56cddc\" (UID: \"2eee2a16-171b-402e-9549-3d14cb56cddc\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.355203 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfcd4\" (UniqueName: \"kubernetes.io/projected/55418747-8c79-496a-9b89-68f9eaa3f01a-kube-api-access-cfcd4\") pod \"55418747-8c79-496a-9b89-68f9eaa3f01a\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.355305 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-utilities\") pod \"55418747-8c79-496a-9b89-68f9eaa3f01a\" (UID: \"55418747-8c79-496a-9b89-68f9eaa3f01a\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.355432 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-utilities" (OuterVolumeSpecName: "utilities") pod "2eee2a16-171b-402e-9549-3d14cb56cddc" (UID: "2eee2a16-171b-402e-9549-3d14cb56cddc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.355559 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgmtm\" (UniqueName: \"kubernetes.io/projected/d21d0539-1e5b-4c6a-b35b-49effbb595b1-kube-api-access-pgmtm\") pod \"marketplace-operator-79b997595-zdlkh\" (UID: \"d21d0539-1e5b-4c6a-b35b-49effbb595b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.355742 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d21d0539-1e5b-4c6a-b35b-49effbb595b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zdlkh\" (UID: \"d21d0539-1e5b-4c6a-b35b-49effbb595b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.355880 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d21d0539-1e5b-4c6a-b35b-49effbb595b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zdlkh\" (UID: \"d21d0539-1e5b-4c6a-b35b-49effbb595b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.355991 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.356061 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5149fd4f-19d7-4852-b09a-d9909b8231dd-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.356386 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-utilities" (OuterVolumeSpecName: "utilities") pod "55418747-8c79-496a-9b89-68f9eaa3f01a" (UID: "55418747-8c79-496a-9b89-68f9eaa3f01a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.361249 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55418747-8c79-496a-9b89-68f9eaa3f01a-kube-api-access-cfcd4" (OuterVolumeSpecName: "kube-api-access-cfcd4") pod "55418747-8c79-496a-9b89-68f9eaa3f01a" (UID: "55418747-8c79-496a-9b89-68f9eaa3f01a"). InnerVolumeSpecName "kube-api-access-cfcd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.366305 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eee2a16-171b-402e-9549-3d14cb56cddc-kube-api-access-zz7zw" (OuterVolumeSpecName: "kube-api-access-zz7zw") pod "2eee2a16-171b-402e-9549-3d14cb56cddc" (UID: "2eee2a16-171b-402e-9549-3d14cb56cddc"). InnerVolumeSpecName "kube-api-access-zz7zw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.448927 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2eee2a16-171b-402e-9549-3d14cb56cddc" (UID: "2eee2a16-171b-402e-9549-3d14cb56cddc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.457218 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-catalog-content\") pod \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.457480 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b27gl\" (UniqueName: \"kubernetes.io/projected/19022af1-394c-4aab-9eb1-ffb0f566d0ac-kube-api-access-b27gl\") pod \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.457640 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjsf5\" (UniqueName: \"kubernetes.io/projected/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-kube-api-access-jjsf5\") pod \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.457747 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-utilities\") pod \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.457831 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-trusted-ca\") pod \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.457950 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-operator-metrics\") pod \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.458040 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-utilities\") pod \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\" (UID: \"19022af1-394c-4aab-9eb1-ffb0f566d0ac\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.458136 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-catalog-content\") pod \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\" (UID: \"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.458243 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvmx4\" (UniqueName: \"kubernetes.io/projected/d0de08e0-63c0-4a90-a264-1bc41b8746d8-kube-api-access-kvmx4\") pod \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\" (UID: \"d0de08e0-63c0-4a90-a264-1bc41b8746d8\") "
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.458982 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-utilities" (OuterVolumeSpecName: "utilities") pod "6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" (UID: "6f1af873-5e8f-4f75-81c2-c9b26ee37f2a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.461294 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgmtm\" (UniqueName: \"kubernetes.io/projected/d21d0539-1e5b-4c6a-b35b-49effbb595b1-kube-api-access-pgmtm\") pod \"marketplace-operator-79b997595-zdlkh\" (UID: \"d21d0539-1e5b-4c6a-b35b-49effbb595b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.461434 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d21d0539-1e5b-4c6a-b35b-49effbb595b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zdlkh\" (UID: \"d21d0539-1e5b-4c6a-b35b-49effbb595b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.461542 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d21d0539-1e5b-4c6a-b35b-49effbb595b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zdlkh\" (UID: \"d21d0539-1e5b-4c6a-b35b-49effbb595b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.461663 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2eee2a16-171b-402e-9549-3d14cb56cddc-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.461730 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.461789 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz7zw\" (UniqueName: \"kubernetes.io/projected/2eee2a16-171b-402e-9549-3d14cb56cddc-kube-api-access-zz7zw\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.461853 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfcd4\" (UniqueName: \"kubernetes.io/projected/55418747-8c79-496a-9b89-68f9eaa3f01a-kube-api-access-cfcd4\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.461917 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.463113 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "d0de08e0-63c0-4a90-a264-1bc41b8746d8" (UID: "d0de08e0-63c0-4a90-a264-1bc41b8746d8"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.463907 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19022af1-394c-4aab-9eb1-ffb0f566d0ac-kube-api-access-b27gl" (OuterVolumeSpecName: "kube-api-access-b27gl") pod "19022af1-394c-4aab-9eb1-ffb0f566d0ac" (UID: "19022af1-394c-4aab-9eb1-ffb0f566d0ac"). InnerVolumeSpecName "kube-api-access-b27gl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.466700 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0de08e0-63c0-4a90-a264-1bc41b8746d8-kube-api-access-kvmx4" (OuterVolumeSpecName: "kube-api-access-kvmx4") pod "d0de08e0-63c0-4a90-a264-1bc41b8746d8" (UID: "d0de08e0-63c0-4a90-a264-1bc41b8746d8"). InnerVolumeSpecName "kube-api-access-kvmx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.464363 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-utilities" (OuterVolumeSpecName: "utilities") pod "19022af1-394c-4aab-9eb1-ffb0f566d0ac" (UID: "19022af1-394c-4aab-9eb1-ffb0f566d0ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.470908 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d21d0539-1e5b-4c6a-b35b-49effbb595b1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-zdlkh\" (UID: \"d21d0539-1e5b-4c6a-b35b-49effbb595b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.473056 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "d0de08e0-63c0-4a90-a264-1bc41b8746d8" (UID: "d0de08e0-63c0-4a90-a264-1bc41b8746d8"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.473519 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-kube-api-access-jjsf5" (OuterVolumeSpecName: "kube-api-access-jjsf5") pod "6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" (UID: "6f1af873-5e8f-4f75-81c2-c9b26ee37f2a"). InnerVolumeSpecName "kube-api-access-jjsf5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.474333 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d21d0539-1e5b-4c6a-b35b-49effbb595b1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-zdlkh\" (UID: \"d21d0539-1e5b-4c6a-b35b-49effbb595b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.481438 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgmtm\" (UniqueName: \"kubernetes.io/projected/d21d0539-1e5b-4c6a-b35b-49effbb595b1-kube-api-access-pgmtm\") pod \"marketplace-operator-79b997595-zdlkh\" (UID: \"d21d0539-1e5b-4c6a-b35b-49effbb595b1\") " pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.509157 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" (UID: "6f1af873-5e8f-4f75-81c2-c9b26ee37f2a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.512665 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19022af1-394c-4aab-9eb1-ffb0f566d0ac" (UID: "19022af1-394c-4aab-9eb1-ffb0f566d0ac"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.563125 4698 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.563159 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-utilities\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.563173 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.563184 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvmx4\" (UniqueName: \"kubernetes.io/projected/d0de08e0-63c0-4a90-a264-1bc41b8746d8-kube-api-access-kvmx4\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.563195 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19022af1-394c-4aab-9eb1-ffb0f566d0ac-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.563206 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b27gl\" (UniqueName: \"kubernetes.io/projected/19022af1-394c-4aab-9eb1-ffb0f566d0ac-kube-api-access-b27gl\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.563218 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjsf5\" (UniqueName: \"kubernetes.io/projected/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a-kube-api-access-jjsf5\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.563231 4698 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d0de08e0-63c0-4a90-a264-1bc41b8746d8-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.583852 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55418747-8c79-496a-9b89-68f9eaa3f01a" (UID: "55418747-8c79-496a-9b89-68f9eaa3f01a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.620556 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="291ba94f-a9ac-4d5c-8476-221496078d80" path="/var/lib/kubelet/pods/291ba94f-a9ac-4d5c-8476-221496078d80/volumes"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.654013 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.664041 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55418747-8c79-496a-9b89-68f9eaa3f01a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.874054 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8hhkf" event={"ID":"2eee2a16-171b-402e-9549-3d14cb56cddc","Type":"ContainerDied","Data":"472cf45ebe9ea39b507669fabcea23daff363164b2f5a031403b9d5d12c57ea3"}
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.874146 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8hhkf"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.874435 4698 scope.go:117] "RemoveContainer" containerID="bfb3f47307f92e49a3096d724083678c00dbda225ab4067de2614a5c8f4a0a9f"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.878633 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-p5dwb" event={"ID":"19022af1-394c-4aab-9eb1-ffb0f566d0ac","Type":"ContainerDied","Data":"d9485b1a7189bca91d54d193b3de764840766a0cb914177787a5f204b8ad2a6c"}
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.878677 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-p5dwb"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.881926 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ppmk4" event={"ID":"55418747-8c79-496a-9b89-68f9eaa3f01a","Type":"ContainerDied","Data":"19df3b359278e17ef43c843a429e1d6694c29b28feddc379809fc373338c52b2"}
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.881999 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ppmk4"
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.887004 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" event={"ID":"d0de08e0-63c0-4a90-a264-1bc41b8746d8","Type":"ContainerDied","Data":"cbd4ce812626fa202a05f89f1758daf6ee0a36e4e8b52e7242c48fc61220faa8"}
Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.887073 4698 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-79f62" Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.890088 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z8bk" event={"ID":"5149fd4f-19d7-4852-b09a-d9909b8231dd","Type":"ContainerDied","Data":"ae52dc38bccc1f74a86c16be9229ddbe43c140c2e39b5f1f23bcb0f790b206d8"} Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.890129 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2z8bk" Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.892974 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p9jbm" event={"ID":"25f4eaf1-6171-44dd-b225-be712a45ba1b","Type":"ContainerDied","Data":"741cf2384aadeabb0a41bac00c46f80c21327dca34056e3677aaefbac4200336"} Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.893003 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p9jbm" Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.893729 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8hhkf"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.895557 4698 generic.go:334] "Generic (PLEG): container finished" podID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" containerID="2c1907d8cbed72fda1c1b6eb1876b793cc232f9ae5d9a1a1d1146c8ac68bcc9e" exitCode=0 Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.895690 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" event={"ID":"101b40a6-d373-47b1-83f5-b5bf8bd579c8","Type":"ContainerDied","Data":"2c1907d8cbed72fda1c1b6eb1876b793cc232f9ae5d9a1a1d1146c8ac68bcc9e"} Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.896447 4698 scope.go:117] "RemoveContainer" containerID="6c6f018a2e8183a1e6015cbe5f46a1e17f6ef288d249975edb32c951a1517fd7" Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.898148 4698 generic.go:334] "Generic (PLEG): container finished" podID="e266bb2f-40eb-4da2-9767-0a300c8dc27b" containerID="ae84a11824584df4e5d291b39c126f5906f6363108f41233ce773636ee70284e" exitCode=0 Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.898168 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" event={"ID":"e266bb2f-40eb-4da2-9767-0a300c8dc27b","Type":"ContainerDied","Data":"ae84a11824584df4e5d291b39c126f5906f6363108f41233ce773636ee70284e"} Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.900835 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8hhkf"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.904160 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dh74l" 
event={"ID":"6f1af873-5e8f-4f75-81c2-c9b26ee37f2a","Type":"ContainerDied","Data":"dd26b6fb73cd2b0977541851f09e591a0eef46bc9736ba4dae9a64dafab09efa"} Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.904276 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dh74l" Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.907292 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ppmk4"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.917141 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ppmk4"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.922701 4698 scope.go:117] "RemoveContainer" containerID="a45c86071d35edeee892f9ff893fd65800461fae44ce4d762643dd1b8709070f" Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.925124 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-79f62"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.930182 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-79f62"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.939793 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-p5dwb"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.947053 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-p5dwb"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.948553 4698 scope.go:117] "RemoveContainer" containerID="7f21bc1060193fd3dc0a8ba603e960a96ad825da9a8bc7332c1b998c3f99b59b" Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.948703 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2z8bk"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.951837 4698 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2z8bk"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.954958 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9jbm"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.958221 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p9jbm"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.963186 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dh74l"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.964535 4698 scope.go:117] "RemoveContainer" containerID="ad84e610cc0720a3fe4e7abe2493501d6e0d3b419be2245cbf762b13501dd835" Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.966619 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dh74l"] Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.983310 4698 scope.go:117] "RemoveContainer" containerID="cc4ae6b045233402d1e99fcc621d900e4d4a1f727d1d78289575c72a48379b26" Feb 24 10:22:45 crc kubenswrapper[4698]: I0224 10:22:45.996829 4698 scope.go:117] "RemoveContainer" containerID="3df0db11f4531eee7e3f2c5268ae565ca343226663e1165dc986a058e42f7c19" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.013417 4698 scope.go:117] "RemoveContainer" containerID="a6effa1b70ed0fb18c196662578693c690137e42eb7e7035900913f24f92668d" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.046510 4698 scope.go:117] "RemoveContainer" containerID="9d94e6b7b514c817e238b3de6823ddc2258e3d730b22674ceb4addc8ea7fa387" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.047617 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-zdlkh"] Feb 24 10:22:46 crc kubenswrapper[4698]: W0224 10:22:46.056391 4698 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd21d0539_1e5b_4c6a_b35b_49effbb595b1.slice/crio-96dccc77b087bfa844f19cc3ccda10a25f546d9ba87978911972ff4f5cd84dbc WatchSource:0}: Error finding container 96dccc77b087bfa844f19cc3ccda10a25f546d9ba87978911972ff4f5cd84dbc: Status 404 returned error can't find the container with id 96dccc77b087bfa844f19cc3ccda10a25f546d9ba87978911972ff4f5cd84dbc Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.062390 4698 scope.go:117] "RemoveContainer" containerID="186142add921ff86958b67725d353a8e19fd4648ca6aa0e8301a4a1c96f0a1cd" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.104060 4698 scope.go:117] "RemoveContainer" containerID="12504475d7d48a69d790441d24d5db1b06a5fc56a42debad4e922a6ebf5b8f0c" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.134148 4698 scope.go:117] "RemoveContainer" containerID="55ee4801ffb6fe8d2af136c3de246e460ae965ded5eeb7dff258a9366a507f2e" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.156162 4698 scope.go:117] "RemoveContainer" containerID="6a7cd93f6b1dfe1bca4fa90156b6ee31171e9e134143bc5f66aae4f55b363fc0" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.193025 4698 scope.go:117] "RemoveContainer" containerID="d27747ceecec130db4ae589810d43e4170d1292ad502837658213b005fb212a6" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.209584 4698 scope.go:117] "RemoveContainer" containerID="3457dcc61e1e58199b3f7067d5f47baefd4b125660f958180282330d0adc4ee0" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.229946 4698 scope.go:117] "RemoveContainer" containerID="c8fbd1359619ee5da259f2643ad462ed01f041a307bbb27fcbe122d7565a7094" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.276967 4698 scope.go:117] "RemoveContainer" containerID="25696aece8ceb278543b675b47f84459a425f66214e17eb1ed294c82dd80be5c" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.294245 4698 scope.go:117] "RemoveContainer" 
containerID="bb77b755be2d1769cc0bb50f56c28187dbf774677a3e89580a42a9bf1f5d8981" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.324144 4698 scope.go:117] "RemoveContainer" containerID="3eac69b7cc348cccf17e267bf0b0792df72bd7ce9641816eebe4c22c26e5ce6d" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.366947 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.472752 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-client-ca\") pod \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.472830 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/101b40a6-d373-47b1-83f5-b5bf8bd579c8-serving-cert\") pod \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.472871 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-config\") pod \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.472893 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsvgh\" (UniqueName: \"kubernetes.io/projected/101b40a6-d373-47b1-83f5-b5bf8bd579c8-kube-api-access-xsvgh\") pod \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\" (UID: \"101b40a6-d373-47b1-83f5-b5bf8bd579c8\") " Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.473631 4698 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-client-ca" (OuterVolumeSpecName: "client-ca") pod "101b40a6-d373-47b1-83f5-b5bf8bd579c8" (UID: "101b40a6-d373-47b1-83f5-b5bf8bd579c8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.474122 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-config" (OuterVolumeSpecName: "config") pod "101b40a6-d373-47b1-83f5-b5bf8bd579c8" (UID: "101b40a6-d373-47b1-83f5-b5bf8bd579c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.477106 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/101b40a6-d373-47b1-83f5-b5bf8bd579c8-kube-api-access-xsvgh" (OuterVolumeSpecName: "kube-api-access-xsvgh") pod "101b40a6-d373-47b1-83f5-b5bf8bd579c8" (UID: "101b40a6-d373-47b1-83f5-b5bf8bd579c8"). InnerVolumeSpecName "kube-api-access-xsvgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.477774 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/101b40a6-d373-47b1-83f5-b5bf8bd579c8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "101b40a6-d373-47b1-83f5-b5bf8bd579c8" (UID: "101b40a6-d373-47b1-83f5-b5bf8bd579c8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.516649 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.573499 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-proxy-ca-bundles\") pod \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.573571 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e266bb2f-40eb-4da2-9767-0a300c8dc27b-serving-cert\") pod \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.573598 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-config\") pod \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.573622 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfrc4\" (UniqueName: \"kubernetes.io/projected/e266bb2f-40eb-4da2-9767-0a300c8dc27b-kube-api-access-sfrc4\") pod \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.573718 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-client-ca\") pod \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\" (UID: \"e266bb2f-40eb-4da2-9767-0a300c8dc27b\") " Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.573936 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.573950 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsvgh\" (UniqueName: \"kubernetes.io/projected/101b40a6-d373-47b1-83f5-b5bf8bd579c8-kube-api-access-xsvgh\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.573963 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/101b40a6-d373-47b1-83f5-b5bf8bd579c8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.573974 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/101b40a6-d373-47b1-83f5-b5bf8bd579c8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.574711 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-client-ca" (OuterVolumeSpecName: "client-ca") pod "e266bb2f-40eb-4da2-9767-0a300c8dc27b" (UID: "e266bb2f-40eb-4da2-9767-0a300c8dc27b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.574719 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e266bb2f-40eb-4da2-9767-0a300c8dc27b" (UID: "e266bb2f-40eb-4da2-9767-0a300c8dc27b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.575169 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-config" (OuterVolumeSpecName: "config") pod "e266bb2f-40eb-4da2-9767-0a300c8dc27b" (UID: "e266bb2f-40eb-4da2-9767-0a300c8dc27b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.576679 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e266bb2f-40eb-4da2-9767-0a300c8dc27b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e266bb2f-40eb-4da2-9767-0a300c8dc27b" (UID: "e266bb2f-40eb-4da2-9767-0a300c8dc27b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.577024 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e266bb2f-40eb-4da2-9767-0a300c8dc27b-kube-api-access-sfrc4" (OuterVolumeSpecName: "kube-api-access-sfrc4") pod "e266bb2f-40eb-4da2-9767-0a300c8dc27b" (UID: "e266bb2f-40eb-4da2-9767-0a300c8dc27b"). InnerVolumeSpecName "kube-api-access-sfrc4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.675140 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.675484 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfrc4\" (UniqueName: \"kubernetes.io/projected/e266bb2f-40eb-4da2-9767-0a300c8dc27b-kube-api-access-sfrc4\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.675497 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.675508 4698 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e266bb2f-40eb-4da2-9767-0a300c8dc27b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.675518 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e266bb2f-40eb-4da2-9767-0a300c8dc27b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887302 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x"] Feb 24 10:22:46 crc kubenswrapper[4698]: E0224 10:22:46.887509 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" containerName="extract-utilities" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887524 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" containerName="extract-utilities" Feb 24 10:22:46 
crc kubenswrapper[4698]: E0224 10:22:46.887537 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e266bb2f-40eb-4da2-9767-0a300c8dc27b" containerName="controller-manager" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887544 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="e266bb2f-40eb-4da2-9767-0a300c8dc27b" containerName="controller-manager" Feb 24 10:22:46 crc kubenswrapper[4698]: E0224 10:22:46.887557 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" containerName="registry-server" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887566 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" containerName="registry-server" Feb 24 10:22:46 crc kubenswrapper[4698]: E0224 10:22:46.887579 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" containerName="extract-utilities" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887586 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" containerName="extract-utilities" Feb 24 10:22:46 crc kubenswrapper[4698]: E0224 10:22:46.887595 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" containerName="registry-server" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887603 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" containerName="registry-server" Feb 24 10:22:46 crc kubenswrapper[4698]: E0224 10:22:46.887611 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" containerName="extract-content" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887618 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" containerName="extract-content" Feb 24 10:22:46 crc 
kubenswrapper[4698]: E0224 10:22:46.887627 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" containerName="marketplace-operator" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887634 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" containerName="marketplace-operator" Feb 24 10:22:46 crc kubenswrapper[4698]: E0224 10:22:46.887642 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" containerName="route-controller-manager" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887649 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" containerName="route-controller-manager" Feb 24 10:22:46 crc kubenswrapper[4698]: E0224 10:22:46.887661 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerName="extract-utilities" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887668 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerName="extract-utilities" Feb 24 10:22:46 crc kubenswrapper[4698]: E0224 10:22:46.887682 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerName="extract-content" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887690 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerName="extract-content" Feb 24 10:22:46 crc kubenswrapper[4698]: E0224 10:22:46.887699 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" containerName="extract-content" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887706 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" containerName="extract-content" 
Feb 24 10:22:46 crc kubenswrapper[4698]: E0224 10:22:46.887717 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerName="registry-server" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887723 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerName="registry-server" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887815 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="e266bb2f-40eb-4da2-9767-0a300c8dc27b" containerName="controller-manager" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887827 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" containerName="registry-server" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887836 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" containerName="registry-server" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887846 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" containerName="registry-server" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887860 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" containerName="route-controller-manager" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.887868 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" containerName="marketplace-operator" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.888285 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.889973 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59cb84db4b-sd7j5"] Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.890544 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.898504 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59cb84db4b-sd7j5"] Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.903476 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x"] Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.912404 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" event={"ID":"e266bb2f-40eb-4da2-9767-0a300c8dc27b","Type":"ContainerDied","Data":"95c7b5dc8f8dad85fb79a0f538f0a865d9eb63f252754474cf9955cba0a04061"} Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.912459 4698 scope.go:117] "RemoveContainer" containerID="ae84a11824584df4e5d291b39c126f5906f6363108f41233ce773636ee70284e" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.912550 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-85cb469ccf-nndrr" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.917359 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh" event={"ID":"d21d0539-1e5b-4c6a-b35b-49effbb595b1","Type":"ContainerStarted","Data":"201af87f820df539737bad9dbf2d0895bf5e68136835bd787407b037ad75280f"} Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.917424 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh" event={"ID":"d21d0539-1e5b-4c6a-b35b-49effbb595b1","Type":"ContainerStarted","Data":"96dccc77b087bfa844f19cc3ccda10a25f546d9ba87978911972ff4f5cd84dbc"} Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.917742 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.922144 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.929983 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" event={"ID":"101b40a6-d373-47b1-83f5-b5bf8bd579c8","Type":"ContainerDied","Data":"93a4a205a6550ca196cdbfb82c857cb9872fe345e65955afd7705ae159354271"} Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.930066 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.934494 4698 scope.go:117] "RemoveContainer" containerID="2c1907d8cbed72fda1c1b6eb1876b793cc232f9ae5d9a1a1d1146c8ac68bcc9e" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.962720 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-zdlkh" podStartSLOduration=1.962697779 podStartE2EDuration="1.962697779s" podCreationTimestamp="2026-02-24 10:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:22:46.958888597 +0000 UTC m=+392.072502858" watchObservedRunningTime="2026-02-24 10:22:46.962697779 +0000 UTC m=+392.076312020" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.988373 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s76v\" (UniqueName: \"kubernetes.io/projected/e67b79e2-0566-4e61-9489-d9fcd311e379-kube-api-access-8s76v\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.988432 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-serving-cert\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.988473 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klw9c\" (UniqueName: 
\"kubernetes.io/projected/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-kube-api-access-klw9c\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.988538 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e67b79e2-0566-4e61-9489-d9fcd311e379-serving-cert\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.988567 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-client-ca\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.988596 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-config\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.988646 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-config\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " 
pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.988689 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-proxy-ca-bundles\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.988784 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-client-ca\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.990385 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6"] Feb 24 10:22:46 crc kubenswrapper[4698]: I0224 10:22:46.995909 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-695f7b9b5-pqzw6"] Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.002591 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-85cb469ccf-nndrr"] Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.007575 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-85cb469ccf-nndrr"] Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.090141 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-proxy-ca-bundles\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.090205 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-client-ca\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.090236 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s76v\" (UniqueName: \"kubernetes.io/projected/e67b79e2-0566-4e61-9489-d9fcd311e379-kube-api-access-8s76v\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.090277 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-serving-cert\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.090292 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klw9c\" (UniqueName: \"kubernetes.io/projected/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-kube-api-access-klw9c\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " 
pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.090317 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e67b79e2-0566-4e61-9489-d9fcd311e379-serving-cert\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.090332 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-client-ca\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.090347 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-config\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.090369 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-config\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.091668 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-proxy-ca-bundles\") pod 
\"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.091697 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-client-ca\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.091736 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-client-ca\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.092112 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-config\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.092604 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-config\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.095185 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-serving-cert\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.097711 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e67b79e2-0566-4e61-9489-d9fcd311e379-serving-cert\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.106984 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klw9c\" (UniqueName: \"kubernetes.io/projected/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-kube-api-access-klw9c\") pod \"route-controller-manager-6ddb5cdb58-fw95x\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.107017 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s76v\" (UniqueName: \"kubernetes.io/projected/e67b79e2-0566-4e61-9489-d9fcd311e379-kube-api-access-8s76v\") pod \"controller-manager-59cb84db4b-sd7j5\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.208463 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.230791 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.591068 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x"] Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.621596 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="101b40a6-d373-47b1-83f5-b5bf8bd579c8" path="/var/lib/kubelet/pods/101b40a6-d373-47b1-83f5-b5bf8bd579c8/volumes" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.622159 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19022af1-394c-4aab-9eb1-ffb0f566d0ac" path="/var/lib/kubelet/pods/19022af1-394c-4aab-9eb1-ffb0f566d0ac/volumes" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.622837 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f4eaf1-6171-44dd-b225-be712a45ba1b" path="/var/lib/kubelet/pods/25f4eaf1-6171-44dd-b225-be712a45ba1b/volumes" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.623963 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eee2a16-171b-402e-9549-3d14cb56cddc" path="/var/lib/kubelet/pods/2eee2a16-171b-402e-9549-3d14cb56cddc/volumes" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.624674 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5149fd4f-19d7-4852-b09a-d9909b8231dd" path="/var/lib/kubelet/pods/5149fd4f-19d7-4852-b09a-d9909b8231dd/volumes" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.625884 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55418747-8c79-496a-9b89-68f9eaa3f01a" path="/var/lib/kubelet/pods/55418747-8c79-496a-9b89-68f9eaa3f01a/volumes" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.626568 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f1af873-5e8f-4f75-81c2-c9b26ee37f2a" 
path="/var/lib/kubelet/pods/6f1af873-5e8f-4f75-81c2-c9b26ee37f2a/volumes" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.627289 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0de08e0-63c0-4a90-a264-1bc41b8746d8" path="/var/lib/kubelet/pods/d0de08e0-63c0-4a90-a264-1bc41b8746d8/volumes" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.628326 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e266bb2f-40eb-4da2-9767-0a300c8dc27b" path="/var/lib/kubelet/pods/e266bb2f-40eb-4da2-9767-0a300c8dc27b/volumes" Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.652377 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59cb84db4b-sd7j5"] Feb 24 10:22:47 crc kubenswrapper[4698]: W0224 10:22:47.659567 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode67b79e2_0566_4e61_9489_d9fcd311e379.slice/crio-33bbd554acd8a30afe08c25814b7efb509f2ff9e00bf20c970ce336fe1993d9e WatchSource:0}: Error finding container 33bbd554acd8a30afe08c25814b7efb509f2ff9e00bf20c970ce336fe1993d9e: Status 404 returned error can't find the container with id 33bbd554acd8a30afe08c25814b7efb509f2ff9e00bf20c970ce336fe1993d9e Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.946792 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" event={"ID":"e67b79e2-0566-4e61-9489-d9fcd311e379","Type":"ContainerStarted","Data":"33bbd554acd8a30afe08c25814b7efb509f2ff9e00bf20c970ce336fe1993d9e"} Feb 24 10:22:47 crc kubenswrapper[4698]: I0224 10:22:47.949375 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" event={"ID":"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8","Type":"ContainerStarted","Data":"d09713355ba0e0b974be78a315c28bba4b6660a12054a7991aaba0c390bb2c8b"} 
Feb 24 10:22:48 crc kubenswrapper[4698]: I0224 10:22:48.956348 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" event={"ID":"e67b79e2-0566-4e61-9489-d9fcd311e379","Type":"ContainerStarted","Data":"7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470"} Feb 24 10:22:48 crc kubenswrapper[4698]: I0224 10:22:48.956701 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:48 crc kubenswrapper[4698]: I0224 10:22:48.959006 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" event={"ID":"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8","Type":"ContainerStarted","Data":"320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410"} Feb 24 10:22:48 crc kubenswrapper[4698]: I0224 10:22:48.962061 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:48 crc kubenswrapper[4698]: I0224 10:22:48.979814 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" podStartSLOduration=3.979796442 podStartE2EDuration="3.979796442s" podCreationTimestamp="2026-02-24 10:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:22:48.975716864 +0000 UTC m=+394.089331125" watchObservedRunningTime="2026-02-24 10:22:48.979796442 +0000 UTC m=+394.093410683" Feb 24 10:22:48 crc kubenswrapper[4698]: I0224 10:22:48.995671 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" podStartSLOduration=3.995648956 podStartE2EDuration="3.995648956s" 
podCreationTimestamp="2026-02-24 10:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:22:48.99170922 +0000 UTC m=+394.105323481" watchObservedRunningTime="2026-02-24 10:22:48.995648956 +0000 UTC m=+394.109263207" Feb 24 10:22:49 crc kubenswrapper[4698]: I0224 10:22:49.426872 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59cb84db4b-sd7j5"] Feb 24 10:22:49 crc kubenswrapper[4698]: I0224 10:22:49.438080 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x"] Feb 24 10:22:49 crc kubenswrapper[4698]: I0224 10:22:49.964679 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:49 crc kubenswrapper[4698]: I0224 10:22:49.974749 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:50 crc kubenswrapper[4698]: I0224 10:22:50.970813 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" podUID="1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8" containerName="route-controller-manager" containerID="cri-o://320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410" gracePeriod=30 Feb 24 10:22:50 crc kubenswrapper[4698]: I0224 10:22:50.971332 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" podUID="e67b79e2-0566-4e61-9489-d9fcd311e379" containerName="controller-manager" containerID="cri-o://7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470" gracePeriod=30 Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 
10:22:51.392568 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.427388 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz"] Feb 24 10:22:51 crc kubenswrapper[4698]: E0224 10:22:51.427751 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8" containerName="route-controller-manager" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.427766 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8" containerName="route-controller-manager" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.428069 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8" containerName="route-controller-manager" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.428665 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.436178 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz"] Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.514578 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.541854 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klw9c\" (UniqueName: \"kubernetes.io/projected/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-kube-api-access-klw9c\") pod \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.542086 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-client-ca\") pod \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.542185 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-config\") pod \"e67b79e2-0566-4e61-9489-d9fcd311e379\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.542224 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s76v\" (UniqueName: \"kubernetes.io/projected/e67b79e2-0566-4e61-9489-d9fcd311e379-kube-api-access-8s76v\") pod \"e67b79e2-0566-4e61-9489-d9fcd311e379\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.542247 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-proxy-ca-bundles\") pod \"e67b79e2-0566-4e61-9489-d9fcd311e379\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.542318 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-serving-cert\") pod \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.542341 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-config\") pod \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\" (UID: \"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8\") " Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.542502 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6f37d6-c4a7-48ba-bb37-18128d69c384-serving-cert\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.542588 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-config\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.542629 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-client-ca\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.542658 4698 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7qgn\" (UniqueName: \"kubernetes.io/projected/9c6f37d6-c4a7-48ba-bb37-18128d69c384-kube-api-access-j7qgn\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.542900 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8" (UID: "1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.543081 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e67b79e2-0566-4e61-9489-d9fcd311e379" (UID: "e67b79e2-0566-4e61-9489-d9fcd311e379"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.543240 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-config" (OuterVolumeSpecName: "config") pod "1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8" (UID: "1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.543298 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-config" (OuterVolumeSpecName: "config") pod "e67b79e2-0566-4e61-9489-d9fcd311e379" (UID: "e67b79e2-0566-4e61-9489-d9fcd311e379"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.547307 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e67b79e2-0566-4e61-9489-d9fcd311e379-kube-api-access-8s76v" (OuterVolumeSpecName: "kube-api-access-8s76v") pod "e67b79e2-0566-4e61-9489-d9fcd311e379" (UID: "e67b79e2-0566-4e61-9489-d9fcd311e379"). InnerVolumeSpecName "kube-api-access-8s76v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.547325 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-kube-api-access-klw9c" (OuterVolumeSpecName: "kube-api-access-klw9c") pod "1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8" (UID: "1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8"). InnerVolumeSpecName "kube-api-access-klw9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.548337 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8" (UID: "1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643151 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-client-ca\") pod \"e67b79e2-0566-4e61-9489-d9fcd311e379\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643191 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e67b79e2-0566-4e61-9489-d9fcd311e379-serving-cert\") pod \"e67b79e2-0566-4e61-9489-d9fcd311e379\" (UID: \"e67b79e2-0566-4e61-9489-d9fcd311e379\") " Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643302 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6f37d6-c4a7-48ba-bb37-18128d69c384-serving-cert\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643350 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-config\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643375 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-client-ca\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " 
pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643401 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7qgn\" (UniqueName: \"kubernetes.io/projected/9c6f37d6-c4a7-48ba-bb37-18128d69c384-kube-api-access-j7qgn\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643463 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643476 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s76v\" (UniqueName: \"kubernetes.io/projected/e67b79e2-0566-4e61-9489-d9fcd311e379-kube-api-access-8s76v\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643489 4698 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643501 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643512 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643526 4698 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-klw9c\" (UniqueName: \"kubernetes.io/projected/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-kube-api-access-klw9c\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643537 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.643861 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-client-ca" (OuterVolumeSpecName: "client-ca") pod "e67b79e2-0566-4e61-9489-d9fcd311e379" (UID: "e67b79e2-0566-4e61-9489-d9fcd311e379"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.644599 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-client-ca\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.644759 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-config\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.646490 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e67b79e2-0566-4e61-9489-d9fcd311e379-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e67b79e2-0566-4e61-9489-d9fcd311e379" (UID: 
"e67b79e2-0566-4e61-9489-d9fcd311e379"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.647146 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6f37d6-c4a7-48ba-bb37-18128d69c384-serving-cert\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.663929 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7qgn\" (UniqueName: \"kubernetes.io/projected/9c6f37d6-c4a7-48ba-bb37-18128d69c384-kube-api-access-j7qgn\") pod \"route-controller-manager-69c79dd4cc-qvzmz\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") " pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.744171 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e67b79e2-0566-4e61-9489-d9fcd311e379-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.744218 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e67b79e2-0566-4e61-9489-d9fcd311e379-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.763996 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.975745 4698 generic.go:334] "Generic (PLEG): container finished" podID="1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8" containerID="320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410" exitCode=0 Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.975805 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.975824 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" event={"ID":"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8","Type":"ContainerDied","Data":"320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410"} Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.976227 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x" event={"ID":"1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8","Type":"ContainerDied","Data":"d09713355ba0e0b974be78a315c28bba4b6660a12054a7991aaba0c390bb2c8b"} Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.976254 4698 scope.go:117] "RemoveContainer" containerID="320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.978253 4698 generic.go:334] "Generic (PLEG): container finished" podID="e67b79e2-0566-4e61-9489-d9fcd311e379" containerID="7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470" exitCode=0 Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.978304 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" 
event={"ID":"e67b79e2-0566-4e61-9489-d9fcd311e379","Type":"ContainerDied","Data":"7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470"} Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.978314 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" Feb 24 10:22:51 crc kubenswrapper[4698]: I0224 10:22:51.978326 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59cb84db4b-sd7j5" event={"ID":"e67b79e2-0566-4e61-9489-d9fcd311e379","Type":"ContainerDied","Data":"33bbd554acd8a30afe08c25814b7efb509f2ff9e00bf20c970ce336fe1993d9e"} Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.000364 4698 scope.go:117] "RemoveContainer" containerID="320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410" Feb 24 10:22:52 crc kubenswrapper[4698]: E0224 10:22:52.000776 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410\": container with ID starting with 320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410 not found: ID does not exist" containerID="320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410" Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.000800 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410"} err="failed to get container status \"320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410\": rpc error: code = NotFound desc = could not find container \"320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410\": container with ID starting with 320d083463d11930bd4f2fc8d7624053394e3f8225d255ad7ccf6efe436da410 not found: ID does not exist" Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 
10:22:52.000818 4698 scope.go:117] "RemoveContainer" containerID="7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470" Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.005942 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x"] Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.013481 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-fw95x"] Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.016924 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59cb84db4b-sd7j5"] Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.017346 4698 scope.go:117] "RemoveContainer" containerID="7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470" Feb 24 10:22:52 crc kubenswrapper[4698]: E0224 10:22:52.017747 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470\": container with ID starting with 7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470 not found: ID does not exist" containerID="7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470" Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.017776 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470"} err="failed to get container status \"7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470\": rpc error: code = NotFound desc = could not find container \"7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470\": container with ID starting with 7e0039d496348db1b5cda7a8cfca8961c58e84a2a0c16a3880e2e247be6d7470 not found: ID does not exist" Feb 24 10:22:52 crc 
kubenswrapper[4698]: I0224 10:22:52.019591 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-59cb84db4b-sd7j5"] Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.169145 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz"] Feb 24 10:22:52 crc kubenswrapper[4698]: W0224 10:22:52.189436 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c6f37d6_c4a7_48ba_bb37_18128d69c384.slice/crio-b3294df224fed6eab3c6e8416ac26818f60bd123598fc2e558e0096fdec68448 WatchSource:0}: Error finding container b3294df224fed6eab3c6e8416ac26818f60bd123598fc2e558e0096fdec68448: Status 404 returned error can't find the container with id b3294df224fed6eab3c6e8416ac26818f60bd123598fc2e558e0096fdec68448 Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.985940 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" event={"ID":"9c6f37d6-c4a7-48ba-bb37-18128d69c384","Type":"ContainerStarted","Data":"74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520"} Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.986280 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" event={"ID":"9c6f37d6-c4a7-48ba-bb37-18128d69c384","Type":"ContainerStarted","Data":"b3294df224fed6eab3c6e8416ac26818f60bd123598fc2e558e0096fdec68448"} Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.986782 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:52 crc kubenswrapper[4698]: I0224 10:22:52.992138 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.009977 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" podStartSLOduration=4.009954461 podStartE2EDuration="4.009954461s" podCreationTimestamp="2026-02-24 10:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:22:53.006590379 +0000 UTC m=+398.120204630" watchObservedRunningTime="2026-02-24 10:22:53.009954461 +0000 UTC m=+398.123568702" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.586521 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hxxxs"] Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.623512 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8" path="/var/lib/kubelet/pods/1d3d97e1-c75a-47ba-b524-2c2bd3b3acd8/volumes" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.625635 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e67b79e2-0566-4e61-9489-d9fcd311e379" path="/var/lib/kubelet/pods/e67b79e2-0566-4e61-9489-d9fcd311e379/volumes" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.889845 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-kg5cl"] Feb 24 10:22:53 crc kubenswrapper[4698]: E0224 10:22:53.890380 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e67b79e2-0566-4e61-9489-d9fcd311e379" containerName="controller-manager" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.890392 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67b79e2-0566-4e61-9489-d9fcd311e379" containerName="controller-manager" Feb 24 10:22:53 crc 
kubenswrapper[4698]: I0224 10:22:53.890482 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="e67b79e2-0566-4e61-9489-d9fcd311e379" containerName="controller-manager" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.890796 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.892984 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.893147 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.893283 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.893546 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.893854 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.894096 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.902660 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 10:22:53 crc kubenswrapper[4698]: I0224 10:22:53.905321 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-kg5cl"] Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.069397 4698 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-client-ca\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.069443 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-config\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.069464 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dgxv\" (UniqueName: \"kubernetes.io/projected/626a1acd-66b2-4f6d-a334-f5b445486cc9-kube-api-access-4dgxv\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.069487 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-proxy-ca-bundles\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.069853 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/626a1acd-66b2-4f6d-a334-f5b445486cc9-serving-cert\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: 
\"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.171354 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-client-ca\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.171435 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-config\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.171462 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dgxv\" (UniqueName: \"kubernetes.io/projected/626a1acd-66b2-4f6d-a334-f5b445486cc9-kube-api-access-4dgxv\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.171494 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-proxy-ca-bundles\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.171547 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/626a1acd-66b2-4f6d-a334-f5b445486cc9-serving-cert\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.172869 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-client-ca\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.173223 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-proxy-ca-bundles\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.173298 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-config\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.177759 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/626a1acd-66b2-4f6d-a334-f5b445486cc9-serving-cert\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.190795 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-4dgxv\" (UniqueName: \"kubernetes.io/projected/626a1acd-66b2-4f6d-a334-f5b445486cc9-kube-api-access-4dgxv\") pod \"controller-manager-5db558bd57-kg5cl\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.265985 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:54 crc kubenswrapper[4698]: I0224 10:22:54.660753 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-kg5cl"] Feb 24 10:22:54 crc kubenswrapper[4698]: W0224 10:22:54.668273 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod626a1acd_66b2_4f6d_a334_f5b445486cc9.slice/crio-a85cd983fee1634d0f8877013d2c6f942eb40b0d8f31bb643a6ada08cd82a796 WatchSource:0}: Error finding container a85cd983fee1634d0f8877013d2c6f942eb40b0d8f31bb643a6ada08cd82a796: Status 404 returned error can't find the container with id a85cd983fee1634d0f8877013d2c6f942eb40b0d8f31bb643a6ada08cd82a796 Feb 24 10:22:55 crc kubenswrapper[4698]: I0224 10:22:55.012636 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" event={"ID":"626a1acd-66b2-4f6d-a334-f5b445486cc9","Type":"ContainerStarted","Data":"a85cd983fee1634d0f8877013d2c6f942eb40b0d8f31bb643a6ada08cd82a796"} Feb 24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.018656 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" event={"ID":"626a1acd-66b2-4f6d-a334-f5b445486cc9","Type":"ContainerStarted","Data":"f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d"} Feb 24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.019335 4698 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.021096 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.022784 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.023329 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.023389 4698 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ad130d0c31b31375e460ffddd5711a065eaaa0c06c4ef3d80a8bcf4702263046" exitCode=137 Feb 24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.023418 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ad130d0c31b31375e460ffddd5711a065eaaa0c06c4ef3d80a8bcf4702263046"} Feb 24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.023451 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1b8c37f6d2fcfc2bed3f3f86065a884d5dd6c22792e1abb32d5b6e994d67e0f3"} Feb 24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.023469 4698 scope.go:117] "RemoveContainer" containerID="e0a9191217045254bf454800fc32d325cc4450d0d4d0d9b6fb4bd6a438872cd9" Feb 
24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.027367 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.048625 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" podStartSLOduration=7.048606814 podStartE2EDuration="7.048606814s" podCreationTimestamp="2026-02-24 10:22:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:22:56.04508012 +0000 UTC m=+401.158694361" watchObservedRunningTime="2026-02-24 10:22:56.048606814 +0000 UTC m=+401.162221055" Feb 24 10:22:56 crc kubenswrapper[4698]: I0224 10:22:56.162714 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.030183 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.033592 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.380966 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lfxsn"] Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.382397 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.384397 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.405447 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lfxsn"] Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.516834 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840583fb-7e48-4ff2-b86a-43b0ce60f5c6-utilities\") pod \"redhat-operators-lfxsn\" (UID: \"840583fb-7e48-4ff2-b86a-43b0ce60f5c6\") " pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.516870 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840583fb-7e48-4ff2-b86a-43b0ce60f5c6-catalog-content\") pod \"redhat-operators-lfxsn\" (UID: \"840583fb-7e48-4ff2-b86a-43b0ce60f5c6\") " pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.516907 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqh8v\" (UniqueName: \"kubernetes.io/projected/840583fb-7e48-4ff2-b86a-43b0ce60f5c6-kube-api-access-pqh8v\") pod \"redhat-operators-lfxsn\" (UID: \"840583fb-7e48-4ff2-b86a-43b0ce60f5c6\") " pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.586711 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tp8lh"] Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.587691 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.589577 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.618379 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d64eda8-e167-4909-b0f0-6308f777cb05-catalog-content\") pod \"community-operators-tp8lh\" (UID: \"5d64eda8-e167-4909-b0f0-6308f777cb05\") " pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.618444 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840583fb-7e48-4ff2-b86a-43b0ce60f5c6-utilities\") pod \"redhat-operators-lfxsn\" (UID: \"840583fb-7e48-4ff2-b86a-43b0ce60f5c6\") " pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.618472 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840583fb-7e48-4ff2-b86a-43b0ce60f5c6-catalog-content\") pod \"redhat-operators-lfxsn\" (UID: \"840583fb-7e48-4ff2-b86a-43b0ce60f5c6\") " pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.618529 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqh8v\" (UniqueName: \"kubernetes.io/projected/840583fb-7e48-4ff2-b86a-43b0ce60f5c6-kube-api-access-pqh8v\") pod \"redhat-operators-lfxsn\" (UID: \"840583fb-7e48-4ff2-b86a-43b0ce60f5c6\") " pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.618647 4698 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhc6g\" (UniqueName: \"kubernetes.io/projected/5d64eda8-e167-4909-b0f0-6308f777cb05-kube-api-access-bhc6g\") pod \"community-operators-tp8lh\" (UID: \"5d64eda8-e167-4909-b0f0-6308f777cb05\") " pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.618718 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d64eda8-e167-4909-b0f0-6308f777cb05-utilities\") pod \"community-operators-tp8lh\" (UID: \"5d64eda8-e167-4909-b0f0-6308f777cb05\") " pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.619824 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840583fb-7e48-4ff2-b86a-43b0ce60f5c6-utilities\") pod \"redhat-operators-lfxsn\" (UID: \"840583fb-7e48-4ff2-b86a-43b0ce60f5c6\") " pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.620432 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840583fb-7e48-4ff2-b86a-43b0ce60f5c6-catalog-content\") pod \"redhat-operators-lfxsn\" (UID: \"840583fb-7e48-4ff2-b86a-43b0ce60f5c6\") " pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.640211 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqh8v\" (UniqueName: \"kubernetes.io/projected/840583fb-7e48-4ff2-b86a-43b0ce60f5c6-kube-api-access-pqh8v\") pod \"redhat-operators-lfxsn\" (UID: \"840583fb-7e48-4ff2-b86a-43b0ce60f5c6\") " pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.640449 4698 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/community-operators-tp8lh"] Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.697166 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.719220 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhc6g\" (UniqueName: \"kubernetes.io/projected/5d64eda8-e167-4909-b0f0-6308f777cb05-kube-api-access-bhc6g\") pod \"community-operators-tp8lh\" (UID: \"5d64eda8-e167-4909-b0f0-6308f777cb05\") " pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.719289 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5d64eda8-e167-4909-b0f0-6308f777cb05-utilities\") pod \"community-operators-tp8lh\" (UID: \"5d64eda8-e167-4909-b0f0-6308f777cb05\") " pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.719313 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d64eda8-e167-4909-b0f0-6308f777cb05-catalog-content\") pod \"community-operators-tp8lh\" (UID: \"5d64eda8-e167-4909-b0f0-6308f777cb05\") " pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.719726 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5d64eda8-e167-4909-b0f0-6308f777cb05-catalog-content\") pod \"community-operators-tp8lh\" (UID: \"5d64eda8-e167-4909-b0f0-6308f777cb05\") " pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.719914 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/5d64eda8-e167-4909-b0f0-6308f777cb05-utilities\") pod \"community-operators-tp8lh\" (UID: \"5d64eda8-e167-4909-b0f0-6308f777cb05\") " pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.741821 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhc6g\" (UniqueName: \"kubernetes.io/projected/5d64eda8-e167-4909-b0f0-6308f777cb05-kube-api-access-bhc6g\") pod \"community-operators-tp8lh\" (UID: \"5d64eda8-e167-4909-b0f0-6308f777cb05\") " pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:22:57 crc kubenswrapper[4698]: I0224 10:22:57.912367 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:22:58 crc kubenswrapper[4698]: I0224 10:22:58.101446 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lfxsn"] Feb 24 10:22:58 crc kubenswrapper[4698]: W0224 10:22:58.108536 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod840583fb_7e48_4ff2_b86a_43b0ce60f5c6.slice/crio-467f05a3e52b10e451efe240fc3b5f287aa635f2f1e85de579ab1362a1709cd3 WatchSource:0}: Error finding container 467f05a3e52b10e451efe240fc3b5f287aa635f2f1e85de579ab1362a1709cd3: Status 404 returned error can't find the container with id 467f05a3e52b10e451efe240fc3b5f287aa635f2f1e85de579ab1362a1709cd3 Feb 24 10:22:58 crc kubenswrapper[4698]: I0224 10:22:58.285766 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tp8lh"] Feb 24 10:22:58 crc kubenswrapper[4698]: W0224 10:22:58.294757 4698 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d64eda8_e167_4909_b0f0_6308f777cb05.slice/crio-1e3960e71f674655f6466674df890dc80811cccdfbc553e0ff712b8400749398 WatchSource:0}: Error finding container 1e3960e71f674655f6466674df890dc80811cccdfbc553e0ff712b8400749398: Status 404 returned error can't find the container with id 1e3960e71f674655f6466674df890dc80811cccdfbc553e0ff712b8400749398 Feb 24 10:22:59 crc kubenswrapper[4698]: I0224 10:22:59.048024 4698 generic.go:334] "Generic (PLEG): container finished" podID="5d64eda8-e167-4909-b0f0-6308f777cb05" containerID="2e0cc5b7a3f1feb064058dd738b4f48350ed8b747e9aa3709db0b87c35b7938c" exitCode=0 Feb 24 10:22:59 crc kubenswrapper[4698]: I0224 10:22:59.048437 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tp8lh" event={"ID":"5d64eda8-e167-4909-b0f0-6308f777cb05","Type":"ContainerDied","Data":"2e0cc5b7a3f1feb064058dd738b4f48350ed8b747e9aa3709db0b87c35b7938c"} Feb 24 10:22:59 crc kubenswrapper[4698]: I0224 10:22:59.048765 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tp8lh" event={"ID":"5d64eda8-e167-4909-b0f0-6308f777cb05","Type":"ContainerStarted","Data":"1e3960e71f674655f6466674df890dc80811cccdfbc553e0ff712b8400749398"} Feb 24 10:22:59 crc kubenswrapper[4698]: I0224 10:22:59.050165 4698 generic.go:334] "Generic (PLEG): container finished" podID="840583fb-7e48-4ff2-b86a-43b0ce60f5c6" containerID="2262ad34ae5f0a9ba6c83ca487e1aba3a7bafbf5f35611a51f2675bad2ce1926" exitCode=0 Feb 24 10:22:59 crc kubenswrapper[4698]: I0224 10:22:59.050219 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lfxsn" event={"ID":"840583fb-7e48-4ff2-b86a-43b0ce60f5c6","Type":"ContainerDied","Data":"2262ad34ae5f0a9ba6c83ca487e1aba3a7bafbf5f35611a51f2675bad2ce1926"} Feb 24 10:22:59 crc kubenswrapper[4698]: I0224 10:22:59.050280 4698 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-lfxsn" event={"ID":"840583fb-7e48-4ff2-b86a-43b0ce60f5c6","Type":"ContainerStarted","Data":"467f05a3e52b10e451efe240fc3b5f287aa635f2f1e85de579ab1362a1709cd3"} Feb 24 10:22:59 crc kubenswrapper[4698]: I0224 10:22:59.979851 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q8s99"] Feb 24 10:22:59 crc kubenswrapper[4698]: I0224 10:22:59.981336 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q8s99" Feb 24 10:22:59 crc kubenswrapper[4698]: I0224 10:22:59.983395 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 24 10:22:59 crc kubenswrapper[4698]: I0224 10:22:59.988359 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q8s99"] Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.147602 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4e74867-0195-4672-8985-40efe6141cc9-utilities\") pod \"certified-operators-q8s99\" (UID: \"a4e74867-0195-4672-8985-40efe6141cc9\") " pod="openshift-marketplace/certified-operators-q8s99" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.147710 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4e74867-0195-4672-8985-40efe6141cc9-catalog-content\") pod \"certified-operators-q8s99\" (UID: \"a4e74867-0195-4672-8985-40efe6141cc9\") " pod="openshift-marketplace/certified-operators-q8s99" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.147747 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfj4m\" (UniqueName: 
\"kubernetes.io/projected/a4e74867-0195-4672-8985-40efe6141cc9-kube-api-access-sfj4m\") pod \"certified-operators-q8s99\" (UID: \"a4e74867-0195-4672-8985-40efe6141cc9\") " pod="openshift-marketplace/certified-operators-q8s99" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.178802 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-r2v4d"] Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.179913 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2v4d" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.184030 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.190505 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2v4d"] Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.248775 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4e74867-0195-4672-8985-40efe6141cc9-utilities\") pod \"certified-operators-q8s99\" (UID: \"a4e74867-0195-4672-8985-40efe6141cc9\") " pod="openshift-marketplace/certified-operators-q8s99" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.248841 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4e74867-0195-4672-8985-40efe6141cc9-catalog-content\") pod \"certified-operators-q8s99\" (UID: \"a4e74867-0195-4672-8985-40efe6141cc9\") " pod="openshift-marketplace/certified-operators-q8s99" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.248876 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfj4m\" (UniqueName: 
\"kubernetes.io/projected/a4e74867-0195-4672-8985-40efe6141cc9-kube-api-access-sfj4m\") pod \"certified-operators-q8s99\" (UID: \"a4e74867-0195-4672-8985-40efe6141cc9\") " pod="openshift-marketplace/certified-operators-q8s99" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.249309 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4e74867-0195-4672-8985-40efe6141cc9-catalog-content\") pod \"certified-operators-q8s99\" (UID: \"a4e74867-0195-4672-8985-40efe6141cc9\") " pod="openshift-marketplace/certified-operators-q8s99" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.249376 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4e74867-0195-4672-8985-40efe6141cc9-utilities\") pod \"certified-operators-q8s99\" (UID: \"a4e74867-0195-4672-8985-40efe6141cc9\") " pod="openshift-marketplace/certified-operators-q8s99" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.270156 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfj4m\" (UniqueName: \"kubernetes.io/projected/a4e74867-0195-4672-8985-40efe6141cc9-kube-api-access-sfj4m\") pod \"certified-operators-q8s99\" (UID: \"a4e74867-0195-4672-8985-40efe6141cc9\") " pod="openshift-marketplace/certified-operators-q8s99" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.311469 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q8s99" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.350464 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvw4w\" (UniqueName: \"kubernetes.io/projected/bd7dda50-f0b2-4d3e-a4af-55644af0b968-kube-api-access-zvw4w\") pod \"redhat-marketplace-r2v4d\" (UID: \"bd7dda50-f0b2-4d3e-a4af-55644af0b968\") " pod="openshift-marketplace/redhat-marketplace-r2v4d" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.350563 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd7dda50-f0b2-4d3e-a4af-55644af0b968-catalog-content\") pod \"redhat-marketplace-r2v4d\" (UID: \"bd7dda50-f0b2-4d3e-a4af-55644af0b968\") " pod="openshift-marketplace/redhat-marketplace-r2v4d" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.350621 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd7dda50-f0b2-4d3e-a4af-55644af0b968-utilities\") pod \"redhat-marketplace-r2v4d\" (UID: \"bd7dda50-f0b2-4d3e-a4af-55644af0b968\") " pod="openshift-marketplace/redhat-marketplace-r2v4d" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.452353 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvw4w\" (UniqueName: \"kubernetes.io/projected/bd7dda50-f0b2-4d3e-a4af-55644af0b968-kube-api-access-zvw4w\") pod \"redhat-marketplace-r2v4d\" (UID: \"bd7dda50-f0b2-4d3e-a4af-55644af0b968\") " pod="openshift-marketplace/redhat-marketplace-r2v4d" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.452680 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd7dda50-f0b2-4d3e-a4af-55644af0b968-catalog-content\") pod 
\"redhat-marketplace-r2v4d\" (UID: \"bd7dda50-f0b2-4d3e-a4af-55644af0b968\") " pod="openshift-marketplace/redhat-marketplace-r2v4d" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.452708 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd7dda50-f0b2-4d3e-a4af-55644af0b968-utilities\") pod \"redhat-marketplace-r2v4d\" (UID: \"bd7dda50-f0b2-4d3e-a4af-55644af0b968\") " pod="openshift-marketplace/redhat-marketplace-r2v4d" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.453543 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd7dda50-f0b2-4d3e-a4af-55644af0b968-utilities\") pod \"redhat-marketplace-r2v4d\" (UID: \"bd7dda50-f0b2-4d3e-a4af-55644af0b968\") " pod="openshift-marketplace/redhat-marketplace-r2v4d" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.454085 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd7dda50-f0b2-4d3e-a4af-55644af0b968-catalog-content\") pod \"redhat-marketplace-r2v4d\" (UID: \"bd7dda50-f0b2-4d3e-a4af-55644af0b968\") " pod="openshift-marketplace/redhat-marketplace-r2v4d" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.471476 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvw4w\" (UniqueName: \"kubernetes.io/projected/bd7dda50-f0b2-4d3e-a4af-55644af0b968-kube-api-access-zvw4w\") pod \"redhat-marketplace-r2v4d\" (UID: \"bd7dda50-f0b2-4d3e-a4af-55644af0b968\") " pod="openshift-marketplace/redhat-marketplace-r2v4d" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.503749 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-r2v4d" Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.720776 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q8s99"] Feb 24 10:23:00 crc kubenswrapper[4698]: W0224 10:23:00.730380 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4e74867_0195_4672_8985_40efe6141cc9.slice/crio-1b0b712994f2960e68ea1da493d1ec25760267c078b995a507c50f1f941c4fe3 WatchSource:0}: Error finding container 1b0b712994f2960e68ea1da493d1ec25760267c078b995a507c50f1f941c4fe3: Status 404 returned error can't find the container with id 1b0b712994f2960e68ea1da493d1ec25760267c078b995a507c50f1f941c4fe3 Feb 24 10:23:00 crc kubenswrapper[4698]: I0224 10:23:00.917735 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-r2v4d"] Feb 24 10:23:00 crc kubenswrapper[4698]: W0224 10:23:00.924936 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd7dda50_f0b2_4d3e_a4af_55644af0b968.slice/crio-dfea1d174bfdcb934637f806b74eba0ff9548f2d28f83e6658ec5226295c725d WatchSource:0}: Error finding container dfea1d174bfdcb934637f806b74eba0ff9548f2d28f83e6658ec5226295c725d: Status 404 returned error can't find the container with id dfea1d174bfdcb934637f806b74eba0ff9548f2d28f83e6658ec5226295c725d Feb 24 10:23:01 crc kubenswrapper[4698]: I0224 10:23:01.069638 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lfxsn" event={"ID":"840583fb-7e48-4ff2-b86a-43b0ce60f5c6","Type":"ContainerStarted","Data":"3c515b8fbdbbd27799bea47f1083c63db9ee24f1610d5cf7d06615c5c7f3421e"} Feb 24 10:23:01 crc kubenswrapper[4698]: I0224 10:23:01.072943 4698 generic.go:334] "Generic (PLEG): container finished" podID="5d64eda8-e167-4909-b0f0-6308f777cb05" 
containerID="ddac2163f9b0b82c5ba4451cfa4bd6feaf156e61604ebff6a6c87f012c9624f1" exitCode=0 Feb 24 10:23:01 crc kubenswrapper[4698]: I0224 10:23:01.073053 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tp8lh" event={"ID":"5d64eda8-e167-4909-b0f0-6308f777cb05","Type":"ContainerDied","Data":"ddac2163f9b0b82c5ba4451cfa4bd6feaf156e61604ebff6a6c87f012c9624f1"} Feb 24 10:23:01 crc kubenswrapper[4698]: I0224 10:23:01.074996 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2v4d" event={"ID":"bd7dda50-f0b2-4d3e-a4af-55644af0b968","Type":"ContainerStarted","Data":"dfea1d174bfdcb934637f806b74eba0ff9548f2d28f83e6658ec5226295c725d"} Feb 24 10:23:01 crc kubenswrapper[4698]: I0224 10:23:01.077803 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8s99" event={"ID":"a4e74867-0195-4672-8985-40efe6141cc9","Type":"ContainerStarted","Data":"1b0b712994f2960e68ea1da493d1ec25760267c078b995a507c50f1f941c4fe3"} Feb 24 10:23:02 crc kubenswrapper[4698]: I0224 10:23:02.088866 4698 generic.go:334] "Generic (PLEG): container finished" podID="840583fb-7e48-4ff2-b86a-43b0ce60f5c6" containerID="3c515b8fbdbbd27799bea47f1083c63db9ee24f1610d5cf7d06615c5c7f3421e" exitCode=0 Feb 24 10:23:02 crc kubenswrapper[4698]: I0224 10:23:02.088971 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lfxsn" event={"ID":"840583fb-7e48-4ff2-b86a-43b0ce60f5c6","Type":"ContainerDied","Data":"3c515b8fbdbbd27799bea47f1083c63db9ee24f1610d5cf7d06615c5c7f3421e"} Feb 24 10:23:02 crc kubenswrapper[4698]: I0224 10:23:02.095863 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tp8lh" event={"ID":"5d64eda8-e167-4909-b0f0-6308f777cb05","Type":"ContainerStarted","Data":"1143934187edd976a5de090200ee60aa101934b0d7342880b6760173093a152e"} Feb 24 10:23:02 crc kubenswrapper[4698]: 
I0224 10:23:02.099534 4698 generic.go:334] "Generic (PLEG): container finished" podID="bd7dda50-f0b2-4d3e-a4af-55644af0b968" containerID="dc5e5744076a4f6a13a902d246c9aa26e9b821cd46bd58388f42eaeafb4f5c02" exitCode=0 Feb 24 10:23:02 crc kubenswrapper[4698]: I0224 10:23:02.099630 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2v4d" event={"ID":"bd7dda50-f0b2-4d3e-a4af-55644af0b968","Type":"ContainerDied","Data":"dc5e5744076a4f6a13a902d246c9aa26e9b821cd46bd58388f42eaeafb4f5c02"} Feb 24 10:23:02 crc kubenswrapper[4698]: I0224 10:23:02.101967 4698 generic.go:334] "Generic (PLEG): container finished" podID="a4e74867-0195-4672-8985-40efe6141cc9" containerID="c17a43655e676e88a5aa17065438406aa37fd6aa5fc7d19e70712585bf6855a3" exitCode=0 Feb 24 10:23:02 crc kubenswrapper[4698]: I0224 10:23:02.102021 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8s99" event={"ID":"a4e74867-0195-4672-8985-40efe6141cc9","Type":"ContainerDied","Data":"c17a43655e676e88a5aa17065438406aa37fd6aa5fc7d19e70712585bf6855a3"} Feb 24 10:23:03 crc kubenswrapper[4698]: I0224 10:23:03.112377 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lfxsn" event={"ID":"840583fb-7e48-4ff2-b86a-43b0ce60f5c6","Type":"ContainerStarted","Data":"18b94b0dc58e8672caaed62760dfe249ef3d96b3392fa2b99b7551cbdb25e477"} Feb 24 10:23:03 crc kubenswrapper[4698]: I0224 10:23:03.130532 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lfxsn" podStartSLOduration=2.677375366 podStartE2EDuration="6.130519203s" podCreationTimestamp="2026-02-24 10:22:57 +0000 UTC" firstStartedPulling="2026-02-24 10:22:59.053643615 +0000 UTC m=+404.167257856" lastFinishedPulling="2026-02-24 10:23:02.506787452 +0000 UTC m=+407.620401693" observedRunningTime="2026-02-24 10:23:03.128934415 +0000 UTC m=+408.242548656" 
watchObservedRunningTime="2026-02-24 10:23:03.130519203 +0000 UTC m=+408.244133444" Feb 24 10:23:03 crc kubenswrapper[4698]: I0224 10:23:03.131103 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tp8lh" podStartSLOduration=3.444112078 podStartE2EDuration="6.131098378s" podCreationTimestamp="2026-02-24 10:22:57 +0000 UTC" firstStartedPulling="2026-02-24 10:22:59.052440437 +0000 UTC m=+404.166054688" lastFinishedPulling="2026-02-24 10:23:01.739426747 +0000 UTC m=+406.853040988" observedRunningTime="2026-02-24 10:23:02.170924001 +0000 UTC m=+407.284538242" watchObservedRunningTime="2026-02-24 10:23:03.131098378 +0000 UTC m=+408.244712619" Feb 24 10:23:04 crc kubenswrapper[4698]: I0224 10:23:04.117400 4698 generic.go:334] "Generic (PLEG): container finished" podID="a4e74867-0195-4672-8985-40efe6141cc9" containerID="c81cceaa06f56711cb1b4b4c089221686076399d31e0dfca1c3ba0fb57ede746" exitCode=0 Feb 24 10:23:04 crc kubenswrapper[4698]: I0224 10:23:04.117438 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8s99" event={"ID":"a4e74867-0195-4672-8985-40efe6141cc9","Type":"ContainerDied","Data":"c81cceaa06f56711cb1b4b4c089221686076399d31e0dfca1c3ba0fb57ede746"} Feb 24 10:23:04 crc kubenswrapper[4698]: I0224 10:23:04.119935 4698 generic.go:334] "Generic (PLEG): container finished" podID="bd7dda50-f0b2-4d3e-a4af-55644af0b968" containerID="f4115d90a3cd551ca25ddab5d7cef0258e01603cf76d2126dec6343cb506420e" exitCode=0 Feb 24 10:23:04 crc kubenswrapper[4698]: I0224 10:23:04.120215 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2v4d" event={"ID":"bd7dda50-f0b2-4d3e-a4af-55644af0b968","Type":"ContainerDied","Data":"f4115d90a3cd551ca25ddab5d7cef0258e01603cf76d2126dec6343cb506420e"} Feb 24 10:23:05 crc kubenswrapper[4698]: I0224 10:23:05.106714 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:23:05 crc kubenswrapper[4698]: I0224 10:23:05.109696 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:23:05 crc kubenswrapper[4698]: I0224 10:23:05.127325 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q8s99" event={"ID":"a4e74867-0195-4672-8985-40efe6141cc9","Type":"ContainerStarted","Data":"4843625a2aeee87beedd86cb22f836ca9d9d7f74df7532e83f443544e9230210"} Feb 24 10:23:05 crc kubenswrapper[4698]: I0224 10:23:05.130750 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-r2v4d" event={"ID":"bd7dda50-f0b2-4d3e-a4af-55644af0b968","Type":"ContainerStarted","Data":"a4bde8a97d247723b5da598b18b97283938a972ca81c64bc93dd6eaa20040056"} Feb 24 10:23:05 crc kubenswrapper[4698]: I0224 10:23:05.135119 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 24 10:23:05 crc kubenswrapper[4698]: I0224 10:23:05.148203 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q8s99" podStartSLOduration=3.621800913 podStartE2EDuration="6.14818847s" podCreationTimestamp="2026-02-24 10:22:59 +0000 UTC" firstStartedPulling="2026-02-24 10:23:02.104142716 +0000 UTC m=+407.217756957" lastFinishedPulling="2026-02-24 10:23:04.630530243 +0000 UTC m=+409.744144514" observedRunningTime="2026-02-24 10:23:05.14610159 +0000 UTC m=+410.259715831" watchObservedRunningTime="2026-02-24 10:23:05.14818847 +0000 UTC m=+410.261802711" Feb 24 10:23:05 crc kubenswrapper[4698]: I0224 10:23:05.171014 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-r2v4d" podStartSLOduration=2.721287959 
podStartE2EDuration="5.170999351s" podCreationTimestamp="2026-02-24 10:23:00 +0000 UTC" firstStartedPulling="2026-02-24 10:23:02.100939439 +0000 UTC m=+407.214553680" lastFinishedPulling="2026-02-24 10:23:04.550650831 +0000 UTC m=+409.664265072" observedRunningTime="2026-02-24 10:23:05.169453864 +0000 UTC m=+410.283068125" watchObservedRunningTime="2026-02-24 10:23:05.170999351 +0000 UTC m=+410.284613612" Feb 24 10:23:07 crc kubenswrapper[4698]: I0224 10:23:07.699104 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:23:07 crc kubenswrapper[4698]: I0224 10:23:07.699181 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lfxsn" Feb 24 10:23:07 crc kubenswrapper[4698]: I0224 10:23:07.912984 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:23:07 crc kubenswrapper[4698]: I0224 10:23:07.913026 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:23:07 crc kubenswrapper[4698]: I0224 10:23:07.953923 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:23:08 crc kubenswrapper[4698]: I0224 10:23:08.203004 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tp8lh" Feb 24 10:23:08 crc kubenswrapper[4698]: I0224 10:23:08.745074 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lfxsn" podUID="840583fb-7e48-4ff2-b86a-43b0ce60f5c6" containerName="registry-server" probeResult="failure" output=< Feb 24 10:23:08 crc kubenswrapper[4698]: timeout: failed to connect service ":50051" within 1s Feb 24 10:23:08 crc kubenswrapper[4698]: > Feb 24 10:23:10 crc kubenswrapper[4698]: 
I0224 10:23:10.312166 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q8s99"
Feb 24 10:23:10 crc kubenswrapper[4698]: I0224 10:23:10.312235 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q8s99"
Feb 24 10:23:10 crc kubenswrapper[4698]: I0224 10:23:10.347879 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q8s99"
Feb 24 10:23:10 crc kubenswrapper[4698]: I0224 10:23:10.505071 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-r2v4d"
Feb 24 10:23:10 crc kubenswrapper[4698]: I0224 10:23:10.505103 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-r2v4d"
Feb 24 10:23:10 crc kubenswrapper[4698]: I0224 10:23:10.607898 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-r2v4d"
Feb 24 10:23:11 crc kubenswrapper[4698]: I0224 10:23:11.222922 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q8s99"
Feb 24 10:23:11 crc kubenswrapper[4698]: I0224 10:23:11.233539 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-r2v4d"
Feb 24 10:23:15 crc kubenswrapper[4698]: I0224 10:23:15.477629 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz"]
Feb 24 10:23:15 crc kubenswrapper[4698]: I0224 10:23:15.478392 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" podUID="9c6f37d6-c4a7-48ba-bb37-18128d69c384" containerName="route-controller-manager" containerID="cri-o://74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520" gracePeriod=30
Feb 24 10:23:15 crc kubenswrapper[4698]: I0224 10:23:15.923713 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.046943 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6f37d6-c4a7-48ba-bb37-18128d69c384-serving-cert\") pod \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") "
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.046998 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-client-ca\") pod \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") "
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.047017 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7qgn\" (UniqueName: \"kubernetes.io/projected/9c6f37d6-c4a7-48ba-bb37-18128d69c384-kube-api-access-j7qgn\") pod \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") "
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.047053 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-config\") pod \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\" (UID: \"9c6f37d6-c4a7-48ba-bb37-18128d69c384\") "
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.048108 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-client-ca" (OuterVolumeSpecName: "client-ca") pod "9c6f37d6-c4a7-48ba-bb37-18128d69c384" (UID: "9c6f37d6-c4a7-48ba-bb37-18128d69c384"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.048140 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-config" (OuterVolumeSpecName: "config") pod "9c6f37d6-c4a7-48ba-bb37-18128d69c384" (UID: "9c6f37d6-c4a7-48ba-bb37-18128d69c384"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.052878 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c6f37d6-c4a7-48ba-bb37-18128d69c384-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9c6f37d6-c4a7-48ba-bb37-18128d69c384" (UID: "9c6f37d6-c4a7-48ba-bb37-18128d69c384"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.054233 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c6f37d6-c4a7-48ba-bb37-18128d69c384-kube-api-access-j7qgn" (OuterVolumeSpecName: "kube-api-access-j7qgn") pod "9c6f37d6-c4a7-48ba-bb37-18128d69c384" (UID: "9c6f37d6-c4a7-48ba-bb37-18128d69c384"). InnerVolumeSpecName "kube-api-access-j7qgn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.148602 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c6f37d6-c4a7-48ba-bb37-18128d69c384-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.148671 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-client-ca\") on node \"crc\" DevicePath \"\""
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.148690 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7qgn\" (UniqueName: \"kubernetes.io/projected/9c6f37d6-c4a7-48ba-bb37-18128d69c384-kube-api-access-j7qgn\") on node \"crc\" DevicePath \"\""
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.148712 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c6f37d6-c4a7-48ba-bb37-18128d69c384-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.198077 4698 generic.go:334] "Generic (PLEG): container finished" podID="9c6f37d6-c4a7-48ba-bb37-18128d69c384" containerID="74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520" exitCode=0
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.198134 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" event={"ID":"9c6f37d6-c4a7-48ba-bb37-18128d69c384","Type":"ContainerDied","Data":"74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520"}
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.198186 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz" event={"ID":"9c6f37d6-c4a7-48ba-bb37-18128d69c384","Type":"ContainerDied","Data":"b3294df224fed6eab3c6e8416ac26818f60bd123598fc2e558e0096fdec68448"}
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.198173 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.198210 4698 scope.go:117] "RemoveContainer" containerID="74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.229794 4698 scope.go:117] "RemoveContainer" containerID="74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520"
Feb 24 10:23:16 crc kubenswrapper[4698]: E0224 10:23:16.231226 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520\": container with ID starting with 74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520 not found: ID does not exist" containerID="74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.231296 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520"} err="failed to get container status \"74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520\": rpc error: code = NotFound desc = could not find container \"74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520\": container with ID starting with 74dd676bd241111954d03937e9d287715e722c778d31ebf2ff942bf5f142d520 not found: ID does not exist"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.249318 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz"]
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.256891 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c79dd4cc-qvzmz"]
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.903740 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"]
Feb 24 10:23:16 crc kubenswrapper[4698]: E0224 10:23:16.904027 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c6f37d6-c4a7-48ba-bb37-18128d69c384" containerName="route-controller-manager"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.904048 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c6f37d6-c4a7-48ba-bb37-18128d69c384" containerName="route-controller-manager"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.904209 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c6f37d6-c4a7-48ba-bb37-18128d69c384" containerName="route-controller-manager"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.904876 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.906663 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.907071 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.907100 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.907180 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.907645 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.908066 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 24 10:23:16 crc kubenswrapper[4698]: I0224 10:23:16.919338 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"]
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.059198 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf0f78f0-c632-4d18-9868-be095b5968c4-client-ca\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.059358 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf0f78f0-c632-4d18-9868-be095b5968c4-config\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.059428 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gckmd\" (UniqueName: \"kubernetes.io/projected/cf0f78f0-c632-4d18-9868-be095b5968c4-kube-api-access-gckmd\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.059529 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0f78f0-c632-4d18-9868-be095b5968c4-serving-cert\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.160650 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0f78f0-c632-4d18-9868-be095b5968c4-serving-cert\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.160857 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf0f78f0-c632-4d18-9868-be095b5968c4-client-ca\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.160978 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf0f78f0-c632-4d18-9868-be095b5968c4-config\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.161042 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gckmd\" (UniqueName: \"kubernetes.io/projected/cf0f78f0-c632-4d18-9868-be095b5968c4-kube-api-access-gckmd\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.162205 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf0f78f0-c632-4d18-9868-be095b5968c4-client-ca\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.163206 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf0f78f0-c632-4d18-9868-be095b5968c4-config\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.169616 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf0f78f0-c632-4d18-9868-be095b5968c4-serving-cert\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.186921 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gckmd\" (UniqueName: \"kubernetes.io/projected/cf0f78f0-c632-4d18-9868-be095b5968c4-kube-api-access-gckmd\") pod \"route-controller-manager-6ddb5cdb58-dk7tz\" (UID: \"cf0f78f0-c632-4d18-9868-be095b5968c4\") " pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.220520 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.623752 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c6f37d6-c4a7-48ba-bb37-18128d69c384" path="/var/lib/kubelet/pods/9c6f37d6-c4a7-48ba-bb37-18128d69c384/volumes"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.629882 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"]
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.759807 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lfxsn"
Feb 24 10:23:17 crc kubenswrapper[4698]: I0224 10:23:17.804869 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lfxsn"
Feb 24 10:23:18 crc kubenswrapper[4698]: I0224 10:23:18.021874 4698 scope.go:117] "RemoveContainer" containerID="674ed085a7507742c61fdb7dae4678b08e315a3679788c5dcbb4df97cdc27c61"
Feb 24 10:23:18 crc kubenswrapper[4698]: I0224 10:23:18.038750 4698 scope.go:117] "RemoveContainer" containerID="a42a2655047e1fb057b615781d8c2ccf50f62f2a70749ef8bb214d32edaba2b1"
Feb 24 10:23:18 crc kubenswrapper[4698]: I0224 10:23:18.217784 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz" event={"ID":"cf0f78f0-c632-4d18-9868-be095b5968c4","Type":"ContainerStarted","Data":"9d412b0ad4c780462a2a70f4b0c744a69761433cce99af8630885a46ca84f610"}
Feb 24 10:23:18 crc kubenswrapper[4698]: I0224 10:23:18.217833 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz" event={"ID":"cf0f78f0-c632-4d18-9868-be095b5968c4","Type":"ContainerStarted","Data":"e584af188cf8371b6e36d97eed37c604a1d327d084760a3dbcd05cdd271ba2db"}
Feb 24 10:23:18 crc kubenswrapper[4698]: I0224 10:23:18.218061 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:18 crc kubenswrapper[4698]: I0224 10:23:18.229097 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz"
Feb 24 10:23:18 crc kubenswrapper[4698]: I0224 10:23:18.237333 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6ddb5cdb58-dk7tz" podStartSLOduration=3.237293111 podStartE2EDuration="3.237293111s" podCreationTimestamp="2026-02-24 10:23:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:23:18.234992335 +0000 UTC m=+423.348606576" watchObservedRunningTime="2026-02-24 10:23:18.237293111 +0000 UTC m=+423.350907392"
Feb 24 10:23:18 crc kubenswrapper[4698]: I0224 10:23:18.608152 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" podUID="803e0d1c-f298-49b4-9251-9271f311ee92" containerName="oauth-openshift" containerID="cri-o://efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0" gracePeriod=15
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.148053 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.186524 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7fcc879758-zvmxk"]
Feb 24 10:23:19 crc kubenswrapper[4698]: E0224 10:23:19.186783 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803e0d1c-f298-49b4-9251-9271f311ee92" containerName="oauth-openshift"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.186802 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="803e0d1c-f298-49b4-9251-9271f311ee92" containerName="oauth-openshift"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.186926 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="803e0d1c-f298-49b4-9251-9271f311ee92" containerName="oauth-openshift"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.187389 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.207193 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7fcc879758-zvmxk"]
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.227402 4698 generic.go:334] "Generic (PLEG): container finished" podID="803e0d1c-f298-49b4-9251-9271f311ee92" containerID="efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0" exitCode=0
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.229356 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.229752 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" event={"ID":"803e0d1c-f298-49b4-9251-9271f311ee92","Type":"ContainerDied","Data":"efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0"}
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.229794 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-hxxxs" event={"ID":"803e0d1c-f298-49b4-9251-9271f311ee92","Type":"ContainerDied","Data":"d71a1b93b994f47cdc66c06b07c608956dbff0035099963d465fb1cc3d85468e"}
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.229821 4698 scope.go:117] "RemoveContainer" containerID="efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.252206 4698 scope.go:117] "RemoveContainer" containerID="efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0"
Feb 24 10:23:19 crc kubenswrapper[4698]: E0224 10:23:19.252714 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0\": container with ID starting with efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0 not found: ID does not exist" containerID="efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.252750 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0"} err="failed to get container status \"efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0\": rpc error: code = NotFound desc = could not find container \"efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0\": container with ID starting with efa081e04ddcd841864dd677845a74dfc001ec419317a95d060c9074e704a5f0 not found: ID does not exist"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286178 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkh9l\" (UniqueName: \"kubernetes.io/projected/803e0d1c-f298-49b4-9251-9271f311ee92-kube-api-access-wkh9l\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286218 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-serving-cert\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286239 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-login\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286276 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-provider-selection\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286305 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/803e0d1c-f298-49b4-9251-9271f311ee92-audit-dir\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286347 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-idp-0-file-data\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286372 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-audit-policies\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286393 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-error\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286415 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-trusted-ca-bundle\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286437 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-router-certs\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286466 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-ocp-branding-template\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286488 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-cliconfig\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286513 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-service-ca\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286535 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-session\") pod \"803e0d1c-f298-49b4-9251-9271f311ee92\" (UID: \"803e0d1c-f298-49b4-9251-9271f311ee92\") "
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286698 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0c4ac853-dd2a-4701-8b35-f1c80d55d694-audit-dir\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286911 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfhjm\" (UniqueName: \"kubernetes.io/projected/0c4ac853-dd2a-4701-8b35-f1c80d55d694-kube-api-access-hfhjm\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286944 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-session\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286960 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.286984 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-audit-policies\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.287001 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.287019 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.287036 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.287065 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.287098 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-template-login\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.287124 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.287153 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-template-error\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.287172 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.287191 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk"
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.287590 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/803e0d1c-f298-49b4-9251-9271f311ee92-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.288346 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.288764 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.288952 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.289044 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.295086 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.295306 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803e0d1c-f298-49b4-9251-9271f311ee92-kube-api-access-wkh9l" (OuterVolumeSpecName: "kube-api-access-wkh9l") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "kube-api-access-wkh9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.295426 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.295715 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.297754 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "v4-0-config-user-template-error".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.298800 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.304117 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.307819 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.308977 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "803e0d1c-f298-49b4-9251-9271f311ee92" (UID: "803e0d1c-f298-49b4-9251-9271f311ee92"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388339 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-template-login\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388418 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388457 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-template-error\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388497 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388524 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388562 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0c4ac853-dd2a-4701-8b35-f1c80d55d694-audit-dir\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388614 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfhjm\" (UniqueName: \"kubernetes.io/projected/0c4ac853-dd2a-4701-8b35-f1c80d55d694-kube-api-access-hfhjm\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388661 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-session\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388694 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " 
pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388745 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-audit-policies\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388779 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388832 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388867 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.388933 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-service-ca\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389029 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389051 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389073 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkh9l\" (UniqueName: \"kubernetes.io/projected/803e0d1c-f298-49b4-9251-9271f311ee92-kube-api-access-wkh9l\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389091 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389113 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389132 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389154 4698 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/803e0d1c-f298-49b4-9251-9271f311ee92-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389556 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389576 4698 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389598 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389617 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389635 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389654 4698 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389672 4698 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/803e0d1c-f298-49b4-9251-9271f311ee92-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.389359 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.390230 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0c4ac853-dd2a-4701-8b35-f1c80d55d694-audit-dir\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.391071 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-audit-policies\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.391592 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-service-ca\") 
pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.392634 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.394429 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-template-login\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.394683 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-router-certs\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.395243 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.396947 
4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.397806 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-user-template-error\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.399512 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.400651 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-session\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.402061 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0c4ac853-dd2a-4701-8b35-f1c80d55d694-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.419044 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfhjm\" (UniqueName: \"kubernetes.io/projected/0c4ac853-dd2a-4701-8b35-f1c80d55d694-kube-api-access-hfhjm\") pod \"oauth-openshift-7fcc879758-zvmxk\" (UID: \"0c4ac853-dd2a-4701-8b35-f1c80d55d694\") " pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.510592 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.577919 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hxxxs"] Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.586815 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-hxxxs"] Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.630409 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="803e0d1c-f298-49b4-9251-9271f311ee92" path="/var/lib/kubelet/pods/803e0d1c-f298-49b4-9251-9271f311ee92/volumes" Feb 24 10:23:19 crc kubenswrapper[4698]: I0224 10:23:19.946448 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7fcc879758-zvmxk"] Feb 24 10:23:20 crc kubenswrapper[4698]: I0224 10:23:20.238698 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" event={"ID":"0c4ac853-dd2a-4701-8b35-f1c80d55d694","Type":"ContainerStarted","Data":"0d5c90a3b5c3c6fae67224bafd4451a4432e0207c4fdd228bad2c002981e5ede"} Feb 24 10:23:21 crc kubenswrapper[4698]: I0224 10:23:21.243309 4698 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" event={"ID":"0c4ac853-dd2a-4701-8b35-f1c80d55d694","Type":"ContainerStarted","Data":"0108e0b03c05ae8fa83450c7b5aa5425954bc222eae3ec92b71437ad0e998d6e"} Feb 24 10:23:21 crc kubenswrapper[4698]: I0224 10:23:21.243622 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:21 crc kubenswrapper[4698]: I0224 10:23:21.250772 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" Feb 24 10:23:21 crc kubenswrapper[4698]: I0224 10:23:21.288340 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7fcc879758-zvmxk" podStartSLOduration=28.288313544 podStartE2EDuration="28.288313544s" podCreationTimestamp="2026-02-24 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:23:21.263996576 +0000 UTC m=+426.377610817" watchObservedRunningTime="2026-02-24 10:23:21.288313544 +0000 UTC m=+426.401927835" Feb 24 10:23:22 crc kubenswrapper[4698]: I0224 10:23:22.197117 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:23:22 crc kubenswrapper[4698]: I0224 10:23:22.197532 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:23:49 crc 
kubenswrapper[4698]: I0224 10:23:49.421882 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-kg5cl"] Feb 24 10:23:49 crc kubenswrapper[4698]: I0224 10:23:49.424558 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" podUID="626a1acd-66b2-4f6d-a334-f5b445486cc9" containerName="controller-manager" containerID="cri-o://f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d" gracePeriod=30 Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:49.845485 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:49.984393 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-proxy-ca-bundles\") pod \"626a1acd-66b2-4f6d-a334-f5b445486cc9\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:49.984429 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-client-ca\") pod \"626a1acd-66b2-4f6d-a334-f5b445486cc9\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:49.984478 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-config\") pod \"626a1acd-66b2-4f6d-a334-f5b445486cc9\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:49.984506 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/626a1acd-66b2-4f6d-a334-f5b445486cc9-serving-cert\") pod \"626a1acd-66b2-4f6d-a334-f5b445486cc9\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:49.984530 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dgxv\" (UniqueName: \"kubernetes.io/projected/626a1acd-66b2-4f6d-a334-f5b445486cc9-kube-api-access-4dgxv\") pod \"626a1acd-66b2-4f6d-a334-f5b445486cc9\" (UID: \"626a1acd-66b2-4f6d-a334-f5b445486cc9\") " Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:49.986013 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "626a1acd-66b2-4f6d-a334-f5b445486cc9" (UID: "626a1acd-66b2-4f6d-a334-f5b445486cc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:49.986034 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "626a1acd-66b2-4f6d-a334-f5b445486cc9" (UID: "626a1acd-66b2-4f6d-a334-f5b445486cc9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:49.986321 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-config" (OuterVolumeSpecName: "config") pod "626a1acd-66b2-4f6d-a334-f5b445486cc9" (UID: "626a1acd-66b2-4f6d-a334-f5b445486cc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:49.990034 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/626a1acd-66b2-4f6d-a334-f5b445486cc9-kube-api-access-4dgxv" (OuterVolumeSpecName: "kube-api-access-4dgxv") pod "626a1acd-66b2-4f6d-a334-f5b445486cc9" (UID: "626a1acd-66b2-4f6d-a334-f5b445486cc9"). InnerVolumeSpecName "kube-api-access-4dgxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:49.997245 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/626a1acd-66b2-4f6d-a334-f5b445486cc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "626a1acd-66b2-4f6d-a334-f5b445486cc9" (UID: "626a1acd-66b2-4f6d-a334-f5b445486cc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.086461 4698 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-config\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.086502 4698 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/626a1acd-66b2-4f6d-a334-f5b445486cc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.086526 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dgxv\" (UniqueName: \"kubernetes.io/projected/626a1acd-66b2-4f6d-a334-f5b445486cc9-kube-api-access-4dgxv\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.086548 4698 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-proxy-ca-bundles\") on node \"crc\" 
DevicePath \"\"" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.086575 4698 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/626a1acd-66b2-4f6d-a334-f5b445486cc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.438606 4698 generic.go:334] "Generic (PLEG): container finished" podID="626a1acd-66b2-4f6d-a334-f5b445486cc9" containerID="f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d" exitCode=0 Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.439042 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" event={"ID":"626a1acd-66b2-4f6d-a334-f5b445486cc9","Type":"ContainerDied","Data":"f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d"} Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.439691 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" event={"ID":"626a1acd-66b2-4f6d-a334-f5b445486cc9","Type":"ContainerDied","Data":"a85cd983fee1634d0f8877013d2c6f942eb40b0d8f31bb643a6ada08cd82a796"} Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.439790 4698 scope.go:117] "RemoveContainer" containerID="f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.439158 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5db558bd57-kg5cl" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.460349 4698 scope.go:117] "RemoveContainer" containerID="f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d" Feb 24 10:23:50 crc kubenswrapper[4698]: E0224 10:23:50.461037 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d\": container with ID starting with f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d not found: ID does not exist" containerID="f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.461099 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d"} err="failed to get container status \"f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d\": rpc error: code = NotFound desc = could not find container \"f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d\": container with ID starting with f7ff096415c8a1fba1127e00c59d8e7243a389553c09dc8d8c67b98173eaa28d not found: ID does not exist" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.482639 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-kg5cl"] Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.484863 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5db558bd57-kg5cl"] Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.939013 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59cb84db4b-8lr55"] Feb 24 10:23:50 crc kubenswrapper[4698]: E0224 10:23:50.939249 4698 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="626a1acd-66b2-4f6d-a334-f5b445486cc9" containerName="controller-manager" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.939286 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="626a1acd-66b2-4f6d-a334-f5b445486cc9" containerName="controller-manager" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.939409 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="626a1acd-66b2-4f6d-a334-f5b445486cc9" containerName="controller-manager" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.939783 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.941708 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.942601 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.942848 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.943027 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.943308 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.943460 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.948758 4698 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.954476 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59cb84db4b-8lr55"] Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.997227 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56dd0dbd-d9b8-46a2-8987-6858aa326d80-client-ca\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.997357 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56dd0dbd-d9b8-46a2-8987-6858aa326d80-proxy-ca-bundles\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.997442 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56dd0dbd-d9b8-46a2-8987-6858aa326d80-config\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.997677 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56dd0dbd-d9b8-46a2-8987-6858aa326d80-serving-cert\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 
10:23:50 crc kubenswrapper[4698]: I0224 10:23:50.997828 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8jhp\" (UniqueName: \"kubernetes.io/projected/56dd0dbd-d9b8-46a2-8987-6858aa326d80-kube-api-access-f8jhp\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.099639 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56dd0dbd-d9b8-46a2-8987-6858aa326d80-client-ca\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.099727 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56dd0dbd-d9b8-46a2-8987-6858aa326d80-proxy-ca-bundles\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.099770 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56dd0dbd-d9b8-46a2-8987-6858aa326d80-config\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.099861 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/56dd0dbd-d9b8-46a2-8987-6858aa326d80-serving-cert\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: 
\"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.099910 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8jhp\" (UniqueName: \"kubernetes.io/projected/56dd0dbd-d9b8-46a2-8987-6858aa326d80-kube-api-access-f8jhp\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.101344 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/56dd0dbd-d9b8-46a2-8987-6858aa326d80-proxy-ca-bundles\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.101646 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/56dd0dbd-d9b8-46a2-8987-6858aa326d80-client-ca\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.103401 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56dd0dbd-d9b8-46a2-8987-6858aa326d80-config\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.106665 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/56dd0dbd-d9b8-46a2-8987-6858aa326d80-serving-cert\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.122109 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8jhp\" (UniqueName: \"kubernetes.io/projected/56dd0dbd-d9b8-46a2-8987-6858aa326d80-kube-api-access-f8jhp\") pod \"controller-manager-59cb84db4b-8lr55\" (UID: \"56dd0dbd-d9b8-46a2-8987-6858aa326d80\") " pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.260802 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.627298 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="626a1acd-66b2-4f6d-a334-f5b445486cc9" path="/var/lib/kubelet/pods/626a1acd-66b2-4f6d-a334-f5b445486cc9/volumes" Feb 24 10:23:51 crc kubenswrapper[4698]: I0224 10:23:51.708617 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59cb84db4b-8lr55"] Feb 24 10:23:52 crc kubenswrapper[4698]: I0224 10:23:52.197073 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:23:52 crc kubenswrapper[4698]: I0224 10:23:52.197123 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:23:52 crc kubenswrapper[4698]: I0224 10:23:52.455474 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" event={"ID":"56dd0dbd-d9b8-46a2-8987-6858aa326d80","Type":"ContainerStarted","Data":"068ace6e1223d7affe085c65e47f7c472d8bf8c1871e8405d9d04320721da0f7"} Feb 24 10:23:52 crc kubenswrapper[4698]: I0224 10:23:52.455539 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" event={"ID":"56dd0dbd-d9b8-46a2-8987-6858aa326d80","Type":"ContainerStarted","Data":"81bd85921ce9d8222edcafff41971f708bd0591da16405e8d6df619a57ae1444"} Feb 24 10:23:52 crc kubenswrapper[4698]: I0224 10:23:52.455961 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:52 crc kubenswrapper[4698]: I0224 10:23:52.463734 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" Feb 24 10:23:52 crc kubenswrapper[4698]: I0224 10:23:52.480155 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59cb84db4b-8lr55" podStartSLOduration=3.480134141 podStartE2EDuration="3.480134141s" podCreationTimestamp="2026-02-24 10:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:23:52.471668801 +0000 UTC m=+457.585283072" watchObservedRunningTime="2026-02-24 10:23:52.480134141 +0000 UTC m=+457.593748402" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.101546 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lj5v8"] Feb 24 10:24:02 crc kubenswrapper[4698]: 
I0224 10:24:02.102639 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.113961 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lj5v8"] Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.176128 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0267dc36-e433-4b8d-b89f-226ccdae88a2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.176183 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0267dc36-e433-4b8d-b89f-226ccdae88a2-trusted-ca\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.176220 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.176242 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wp5d\" (UniqueName: \"kubernetes.io/projected/0267dc36-e433-4b8d-b89f-226ccdae88a2-kube-api-access-9wp5d\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: 
\"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.176274 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0267dc36-e433-4b8d-b89f-226ccdae88a2-registry-certificates\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.176312 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0267dc36-e433-4b8d-b89f-226ccdae88a2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.176331 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0267dc36-e433-4b8d-b89f-226ccdae88a2-registry-tls\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.176351 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0267dc36-e433-4b8d-b89f-226ccdae88a2-bound-sa-token\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.201656 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.277452 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wp5d\" (UniqueName: \"kubernetes.io/projected/0267dc36-e433-4b8d-b89f-226ccdae88a2-kube-api-access-9wp5d\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.277501 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0267dc36-e433-4b8d-b89f-226ccdae88a2-registry-certificates\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.277554 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0267dc36-e433-4b8d-b89f-226ccdae88a2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.277584 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0267dc36-e433-4b8d-b89f-226ccdae88a2-registry-tls\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc 
kubenswrapper[4698]: I0224 10:24:02.277621 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0267dc36-e433-4b8d-b89f-226ccdae88a2-bound-sa-token\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.277651 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0267dc36-e433-4b8d-b89f-226ccdae88a2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.277670 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0267dc36-e433-4b8d-b89f-226ccdae88a2-trusted-ca\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.279038 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0267dc36-e433-4b8d-b89f-226ccdae88a2-ca-trust-extracted\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.280030 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0267dc36-e433-4b8d-b89f-226ccdae88a2-registry-certificates\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.280566 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0267dc36-e433-4b8d-b89f-226ccdae88a2-trusted-ca\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.287171 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0267dc36-e433-4b8d-b89f-226ccdae88a2-installation-pull-secrets\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.301237 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0267dc36-e433-4b8d-b89f-226ccdae88a2-registry-tls\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.301553 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0267dc36-e433-4b8d-b89f-226ccdae88a2-bound-sa-token\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: \"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.303020 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wp5d\" (UniqueName: \"kubernetes.io/projected/0267dc36-e433-4b8d-b89f-226ccdae88a2-kube-api-access-9wp5d\") pod \"image-registry-66df7c8f76-lj5v8\" (UID: 
\"0267dc36-e433-4b8d-b89f-226ccdae88a2\") " pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.417497 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:02 crc kubenswrapper[4698]: I0224 10:24:02.869185 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-lj5v8"] Feb 24 10:24:03 crc kubenswrapper[4698]: I0224 10:24:03.527975 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" event={"ID":"0267dc36-e433-4b8d-b89f-226ccdae88a2","Type":"ContainerStarted","Data":"478cc2e215858f94a1823820a46bc9d7cb94cbf5ae8581334ef2b63c74f679b9"} Feb 24 10:24:03 crc kubenswrapper[4698]: I0224 10:24:03.528034 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" event={"ID":"0267dc36-e433-4b8d-b89f-226ccdae88a2","Type":"ContainerStarted","Data":"01a384cf39ae5c32a48ba8685494c5be7b7c4dcf99e8e23353b73ad3267dc1d8"} Feb 24 10:24:03 crc kubenswrapper[4698]: I0224 10:24:03.528159 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:03 crc kubenswrapper[4698]: I0224 10:24:03.548121 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" podStartSLOduration=1.548103338 podStartE2EDuration="1.548103338s" podCreationTimestamp="2026-02-24 10:24:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:24:03.546332662 +0000 UTC m=+468.659946913" watchObservedRunningTime="2026-02-24 10:24:03.548103338 +0000 UTC m=+468.661717589" Feb 24 10:24:18 crc kubenswrapper[4698]: I0224 
10:24:18.247251 4698 scope.go:117] "RemoveContainer" containerID="fa3d4a95fd60ff55d1850deb923135ed607172e7676a141a5d52e6cdd60b23bc" Feb 24 10:24:22 crc kubenswrapper[4698]: I0224 10:24:22.197558 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:24:22 crc kubenswrapper[4698]: I0224 10:24:22.197883 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:24:22 crc kubenswrapper[4698]: I0224 10:24:22.197935 4698 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:24:22 crc kubenswrapper[4698]: I0224 10:24:22.198439 4698 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c26c2143394059f82ffa50e03f99ae3948741b5030a14c47db3d70836dce763e"} pod="openshift-machine-config-operator/machine-config-daemon-nn578" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 10:24:22 crc kubenswrapper[4698]: I0224 10:24:22.198499 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" containerID="cri-o://c26c2143394059f82ffa50e03f99ae3948741b5030a14c47db3d70836dce763e" gracePeriod=600 Feb 24 10:24:22 crc kubenswrapper[4698]: I0224 10:24:22.424017 4698 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-lj5v8" Feb 24 10:24:22 crc kubenswrapper[4698]: I0224 10:24:22.489966 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bmr2l"] Feb 24 10:24:22 crc kubenswrapper[4698]: I0224 10:24:22.642658 4698 generic.go:334] "Generic (PLEG): container finished" podID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerID="c26c2143394059f82ffa50e03f99ae3948741b5030a14c47db3d70836dce763e" exitCode=0 Feb 24 10:24:22 crc kubenswrapper[4698]: I0224 10:24:22.642702 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerDied","Data":"c26c2143394059f82ffa50e03f99ae3948741b5030a14c47db3d70836dce763e"} Feb 24 10:24:22 crc kubenswrapper[4698]: I0224 10:24:22.642736 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerStarted","Data":"0b7da0d5fae2f1471fcf65125ad5cf893f00a676ecd1a2c2a431023ddbdfc83e"} Feb 24 10:24:22 crc kubenswrapper[4698]: I0224 10:24:22.642758 4698 scope.go:117] "RemoveContainer" containerID="a0c8bc2bc5ebfb2472863808bf33f95f8aa74ed45b546ed1a1b3be4883af700e" Feb 24 10:24:47 crc kubenswrapper[4698]: I0224 10:24:47.542538 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" podUID="9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" containerName="registry" containerID="cri-o://dbe5d65c1a1d317becc5cb56edfb4174efe2e60d6ae5614685e4bda391a02b7f" gracePeriod=30 Feb 24 10:24:47 crc kubenswrapper[4698]: I0224 10:24:47.794144 4698 generic.go:334] "Generic (PLEG): container finished" podID="9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" 
containerID="dbe5d65c1a1d317becc5cb56edfb4174efe2e60d6ae5614685e4bda391a02b7f" exitCode=0
Feb 24 10:24:47 crc kubenswrapper[4698]: I0224 10:24:47.794281 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" event={"ID":"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8","Type":"ContainerDied","Data":"dbe5d65c1a1d317becc5cb56edfb4174efe2e60d6ae5614685e4bda391a02b7f"}
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.042062 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.156153 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-bound-sa-token\") pod \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") "
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.156281 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-trusted-ca\") pod \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") "
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.156347 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-certificates\") pod \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") "
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.156398 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wzkz\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-kube-api-access-5wzkz\") pod \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") "
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.156421 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-installation-pull-secrets\") pod \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") "
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.156454 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-ca-trust-extracted\") pod \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") "
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.156478 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-tls\") pod \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") "
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.156599 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\" (UID: \"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8\") "
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.158113 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.158316 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.163631 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.164386 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.165188 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.167184 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-kube-api-access-5wzkz" (OuterVolumeSpecName: "kube-api-access-5wzkz") pod "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8"). InnerVolumeSpecName "kube-api-access-5wzkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.179881 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.186626 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" (UID: "9ded6944-ff06-4cd5-beef-4dbb3cb9aba8"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.258483 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wzkz\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-kube-api-access-5wzkz\") on node \"crc\" DevicePath \"\""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.258563 4698 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.258574 4698 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.258585 4698 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.258595 4698 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.258602 4698 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.258630 4698 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.802722 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l" event={"ID":"9ded6944-ff06-4cd5-beef-4dbb3cb9aba8","Type":"ContainerDied","Data":"cd4d43d6cfa657e787a7689307d1aa88f14ab135f434fe79bab08c643b2f70c4"}
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.802790 4698 scope.go:117] "RemoveContainer" containerID="dbe5d65c1a1d317becc5cb56edfb4174efe2e60d6ae5614685e4bda391a02b7f"
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.802922 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bmr2l"
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.868909 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bmr2l"]
Feb 24 10:24:48 crc kubenswrapper[4698]: I0224 10:24:48.875371 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bmr2l"]
Feb 24 10:24:49 crc kubenswrapper[4698]: I0224 10:24:49.626478 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" path="/var/lib/kubelet/pods/9ded6944-ff06-4cd5-beef-4dbb3cb9aba8/volumes"
Feb 24 10:26:22 crc kubenswrapper[4698]: I0224 10:26:22.196247 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:26:22 crc kubenswrapper[4698]: I0224 10:26:22.196803 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:26:52 crc kubenswrapper[4698]: I0224 10:26:52.196760 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:26:52 crc kubenswrapper[4698]: I0224 10:26:52.197315 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:27:22 crc kubenswrapper[4698]: I0224 10:27:22.197544 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:27:22 crc kubenswrapper[4698]: I0224 10:27:22.198134 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:27:22 crc kubenswrapper[4698]: I0224 10:27:22.198197 4698 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nn578"
Feb 24 10:27:22 crc kubenswrapper[4698]: I0224 10:27:22.198820 4698 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0b7da0d5fae2f1471fcf65125ad5cf893f00a676ecd1a2c2a431023ddbdfc83e"} pod="openshift-machine-config-operator/machine-config-daemon-nn578" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 24 10:27:22 crc kubenswrapper[4698]: I0224 10:27:22.198900 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" containerID="cri-o://0b7da0d5fae2f1471fcf65125ad5cf893f00a676ecd1a2c2a431023ddbdfc83e" gracePeriod=600
Feb 24 10:27:22 crc kubenswrapper[4698]: I0224 10:27:22.804048 4698 generic.go:334] "Generic (PLEG): container finished" podID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerID="0b7da0d5fae2f1471fcf65125ad5cf893f00a676ecd1a2c2a431023ddbdfc83e" exitCode=0
Feb 24 10:27:22 crc kubenswrapper[4698]: I0224 10:27:22.804144 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerDied","Data":"0b7da0d5fae2f1471fcf65125ad5cf893f00a676ecd1a2c2a431023ddbdfc83e"}
Feb 24 10:27:22 crc kubenswrapper[4698]: I0224 10:27:22.804529 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerStarted","Data":"829b9213c4c673d3133873424826a5ea12ee4cbf361962bd4c39f0f65c6f48c4"}
Feb 24 10:27:22 crc kubenswrapper[4698]: I0224 10:27:22.804635 4698 scope.go:117] "RemoveContainer" containerID="c26c2143394059f82ffa50e03f99ae3948741b5030a14c47db3d70836dce763e"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.871702 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gqw6x"]
Feb 24 10:27:40 crc kubenswrapper[4698]: E0224 10:27:40.872404 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" containerName="registry"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.872415 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" containerName="registry"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.872507 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ded6944-ff06-4cd5-beef-4dbb3cb9aba8" containerName="registry"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.872820 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gqw6x"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.875253 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.875636 4698 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-r2mbb"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.875893 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.884341 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gqw6x"]
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.890780 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-hfbhk"]
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.891679 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-hfbhk"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.893235 4698 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-7hlvd"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.909741 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-hfbhk"]
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.917950 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtgwb\" (UniqueName: \"kubernetes.io/projected/fe274629-9e47-49f8-9143-197a20079184-kube-api-access-gtgwb\") pod \"cert-manager-858654f9db-hfbhk\" (UID: \"fe274629-9e47-49f8-9143-197a20079184\") " pod="cert-manager/cert-manager-858654f9db-hfbhk"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.918004 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmzj6\" (UniqueName: \"kubernetes.io/projected/391d35ea-1367-4774-8059-e2914203784f-kube-api-access-xmzj6\") pod \"cert-manager-cainjector-cf98fcc89-gqw6x\" (UID: \"391d35ea-1367-4774-8059-e2914203784f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gqw6x"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.928856 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-v4cfs"]
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.929941 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-v4cfs"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.931938 4698 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-7wc77"
Feb 24 10:27:40 crc kubenswrapper[4698]: I0224 10:27:40.947391 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-v4cfs"]
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.019377 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtgwb\" (UniqueName: \"kubernetes.io/projected/fe274629-9e47-49f8-9143-197a20079184-kube-api-access-gtgwb\") pod \"cert-manager-858654f9db-hfbhk\" (UID: \"fe274629-9e47-49f8-9143-197a20079184\") " pod="cert-manager/cert-manager-858654f9db-hfbhk"
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.019430 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmzj6\" (UniqueName: \"kubernetes.io/projected/391d35ea-1367-4774-8059-e2914203784f-kube-api-access-xmzj6\") pod \"cert-manager-cainjector-cf98fcc89-gqw6x\" (UID: \"391d35ea-1367-4774-8059-e2914203784f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gqw6x"
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.019469 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6qb2\" (UniqueName: \"kubernetes.io/projected/5b88d3e6-2f9e-4a75-81ac-5f19c5f91daf-kube-api-access-q6qb2\") pod \"cert-manager-webhook-687f57d79b-v4cfs\" (UID: \"5b88d3e6-2f9e-4a75-81ac-5f19c5f91daf\") " pod="cert-manager/cert-manager-webhook-687f57d79b-v4cfs"
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.039631 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmzj6\" (UniqueName: \"kubernetes.io/projected/391d35ea-1367-4774-8059-e2914203784f-kube-api-access-xmzj6\") pod \"cert-manager-cainjector-cf98fcc89-gqw6x\" (UID: \"391d35ea-1367-4774-8059-e2914203784f\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-gqw6x"
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.041849 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtgwb\" (UniqueName: \"kubernetes.io/projected/fe274629-9e47-49f8-9143-197a20079184-kube-api-access-gtgwb\") pod \"cert-manager-858654f9db-hfbhk\" (UID: \"fe274629-9e47-49f8-9143-197a20079184\") " pod="cert-manager/cert-manager-858654f9db-hfbhk"
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.121084 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6qb2\" (UniqueName: \"kubernetes.io/projected/5b88d3e6-2f9e-4a75-81ac-5f19c5f91daf-kube-api-access-q6qb2\") pod \"cert-manager-webhook-687f57d79b-v4cfs\" (UID: \"5b88d3e6-2f9e-4a75-81ac-5f19c5f91daf\") " pod="cert-manager/cert-manager-webhook-687f57d79b-v4cfs"
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.145515 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6qb2\" (UniqueName: \"kubernetes.io/projected/5b88d3e6-2f9e-4a75-81ac-5f19c5f91daf-kube-api-access-q6qb2\") pod \"cert-manager-webhook-687f57d79b-v4cfs\" (UID: \"5b88d3e6-2f9e-4a75-81ac-5f19c5f91daf\") " pod="cert-manager/cert-manager-webhook-687f57d79b-v4cfs"
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.190988 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gqw6x"
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.212115 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-hfbhk"
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.244957 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-v4cfs"
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.392674 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-gqw6x"]
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.414331 4698 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.503304 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-hfbhk"]
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.527505 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-v4cfs"]
Feb 24 10:27:41 crc kubenswrapper[4698]: W0224 10:27:41.530154 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b88d3e6_2f9e_4a75_81ac_5f19c5f91daf.slice/crio-6e19b4cf6ca50bbb343a94e158833179111798f77ef8104dfc82e5f13ea8f8e7 WatchSource:0}: Error finding container 6e19b4cf6ca50bbb343a94e158833179111798f77ef8104dfc82e5f13ea8f8e7: Status 404 returned error can't find the container with id 6e19b4cf6ca50bbb343a94e158833179111798f77ef8104dfc82e5f13ea8f8e7
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.944538 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-hfbhk" event={"ID":"fe274629-9e47-49f8-9143-197a20079184","Type":"ContainerStarted","Data":"05d8836d5aa053ea2c6370a28b616765f31d743f6b31d361aebad44d5765f0e3"}
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.945900 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gqw6x" event={"ID":"391d35ea-1367-4774-8059-e2914203784f","Type":"ContainerStarted","Data":"6bb6d48ce729b2830a408e178ece7aea78b55827bce9d8008f51b5fb328d12d9"}
Feb 24 10:27:41 crc kubenswrapper[4698]: I0224 10:27:41.946931 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-v4cfs" event={"ID":"5b88d3e6-2f9e-4a75-81ac-5f19c5f91daf","Type":"ContainerStarted","Data":"6e19b4cf6ca50bbb343a94e158833179111798f77ef8104dfc82e5f13ea8f8e7"}
Feb 24 10:27:49 crc kubenswrapper[4698]: I0224 10:27:49.996499 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gqw6x" event={"ID":"391d35ea-1367-4774-8059-e2914203784f","Type":"ContainerStarted","Data":"3d5b42860996a31b20f9ae795604d67bf7166a459572933e116adc1839e4a1ea"}
Feb 24 10:27:49 crc kubenswrapper[4698]: I0224 10:27:49.998345 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-v4cfs" event={"ID":"5b88d3e6-2f9e-4a75-81ac-5f19c5f91daf","Type":"ContainerStarted","Data":"94ee19d513f01a6ddade3a0d22b0e342375e40425fd1875f2e75e68a9b766934"}
Feb 24 10:27:49 crc kubenswrapper[4698]: I0224 10:27:49.998472 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-v4cfs"
Feb 24 10:27:49 crc kubenswrapper[4698]: I0224 10:27:49.999833 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-hfbhk" event={"ID":"fe274629-9e47-49f8-9143-197a20079184","Type":"ContainerStarted","Data":"51b83277f30518c51bc78fcb4bbb5704fc7091f2c60c31859b3be532bd60d253"}
Feb 24 10:27:50 crc kubenswrapper[4698]: I0224 10:27:50.014617 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-gqw6x" podStartSLOduration=1.635240754 podStartE2EDuration="10.014597171s" podCreationTimestamp="2026-02-24 10:27:40 +0000 UTC" firstStartedPulling="2026-02-24 10:27:41.41401876 +0000 UTC m=+686.527633011" lastFinishedPulling="2026-02-24 10:27:49.793375147 +0000 UTC m=+694.906989428" observedRunningTime="2026-02-24 10:27:50.009993698 +0000 UTC m=+695.123607959" watchObservedRunningTime="2026-02-24 10:27:50.014597171 +0000 UTC m=+695.128211412"
Feb 24 10:27:50 crc kubenswrapper[4698]: I0224 10:27:50.031157 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-hfbhk" podStartSLOduration=1.8931350789999999 podStartE2EDuration="10.031138412s" podCreationTimestamp="2026-02-24 10:27:40 +0000 UTC" firstStartedPulling="2026-02-24 10:27:41.507509681 +0000 UTC m=+686.621123922" lastFinishedPulling="2026-02-24 10:27:49.645513014 +0000 UTC m=+694.759127255" observedRunningTime="2026-02-24 10:27:50.02772215 +0000 UTC m=+695.141336401" watchObservedRunningTime="2026-02-24 10:27:50.031138412 +0000 UTC m=+695.144752643"
Feb 24 10:27:50 crc kubenswrapper[4698]: I0224 10:27:50.047966 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-v4cfs" podStartSLOduration=1.925350813 podStartE2EDuration="10.047951532s" podCreationTimestamp="2026-02-24 10:27:40 +0000 UTC" firstStartedPulling="2026-02-24 10:27:41.533119613 +0000 UTC m=+686.646733854" lastFinishedPulling="2026-02-24 10:27:49.655720312 +0000 UTC m=+694.769334573" observedRunningTime="2026-02-24 10:27:50.046375982 +0000 UTC m=+695.159990223" watchObservedRunningTime="2026-02-24 10:27:50.047951532 +0000 UTC m=+695.161565773"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.067108 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mgh7p"]
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.067586 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovn-controller" containerID="cri-o://096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213" gracePeriod=30
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.067710 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="kube-rbac-proxy-node" containerID="cri-o://60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece" gracePeriod=30
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.067677 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="nbdb" containerID="cri-o://7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585" gracePeriod=30
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.067739 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="northd" containerID="cri-o://f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20" gracePeriod=30
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.067823 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07" gracePeriod=30
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.067757 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovn-acl-logging" containerID="cri-o://444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99" gracePeriod=30
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.067751 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="sbdb" containerID="cri-o://1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288" gracePeriod=30
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.104513 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller" containerID="cri-o://1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c" gracePeriod=30
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.455130 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/3.log"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.457880 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovn-acl-logging/0.log"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.458920 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovn-controller/0.log"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.459380 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.534945 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-s4js6"]
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535230 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535256 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535291 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535303 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535331 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="kubecfg-setup"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535343 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="kubecfg-setup"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535357 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="kube-rbac-proxy-ovn-metrics"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535367 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="kube-rbac-proxy-ovn-metrics"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535382 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovn-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535392 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovn-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535431 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="nbdb"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535443 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="nbdb"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535453 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="sbdb"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535463 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="sbdb"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535478 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535489 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535499 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535509 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535535 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="northd"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535545 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="northd"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535563 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovn-acl-logging"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535573 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovn-acl-logging"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.535584 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="kube-rbac-proxy-node"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535594 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="kube-rbac-proxy-node"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535755 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovn-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535773 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535790 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="kube-rbac-proxy-node"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535802 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovn-acl-logging"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535814 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="northd"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535829 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="sbdb"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535844 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="nbdb"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535857 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535898 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="kube-rbac-proxy-ovn-metrics"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.535912 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: E0224 10:27:51.536075 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.536089 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.536244 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.546163 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="066df704-6981-4770-a647-df52a0da50a0" containerName="ovnkube-controller"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.548369 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.559986 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-bin\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") "
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560037 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-log-socket\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") "
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560076 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-netns\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") "
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560118 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-config\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") "
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560114 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "host-cni-bin".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560149 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-ovn-kubernetes\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560175 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560190 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-ovn\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560226 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-systemd-units\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560293 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-script-lib\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560331 4698 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-netd\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560357 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-kubelet\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560386 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-env-overrides\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560420 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2k2d\" (UniqueName: \"kubernetes.io/projected/066df704-6981-4770-a647-df52a0da50a0-kube-api-access-l2k2d\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560456 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-openvswitch\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560505 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/066df704-6981-4770-a647-df52a0da50a0-ovn-node-metrics-cert\") pod 
\"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560539 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560570 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-slash\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560606 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-node-log\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560639 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-systemd\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560671 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-var-lib-openvswitch\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560701 4698 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-etc-openvswitch\") pod \"066df704-6981-4770-a647-df52a0da50a0\" (UID: \"066df704-6981-4770-a647-df52a0da50a0\") " Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560811 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-run-openvswitch\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560852 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87nzp\" (UniqueName: \"kubernetes.io/projected/0c229880-23d0-4fb6-b111-883f50038f6d-kube-api-access-87nzp\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560893 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-run-ovn\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560991 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-systemd-units\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561026 4698 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-etc-openvswitch\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561057 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561105 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c229880-23d0-4fb6-b111-883f50038f6d-ovn-node-metrics-cert\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561137 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-cni-netd\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561181 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-var-lib-openvswitch\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 
24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561215 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-cni-bin\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561248 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0c229880-23d0-4fb6-b111-883f50038f6d-ovnkube-script-lib\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561332 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-slash\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561373 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-run-netns\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561405 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-run-ovn-kubernetes\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561435 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-node-log\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561467 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-log-socket\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561503 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0c229880-23d0-4fb6-b111-883f50038f6d-ovnkube-config\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561533 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-run-systemd\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561561 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0c229880-23d0-4fb6-b111-883f50038f6d-env-overrides\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561592 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-kubelet\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561653 4698 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561672 4698 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560197 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-log-socket" (OuterVolumeSpecName: "log-socket") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.560654 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561757 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.561787 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.562006 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.562034 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.562059 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.562149 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.562709 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.562759 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.562963 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-slash" (OuterVolumeSpecName: "host-slash") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.563102 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.566122 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.566158 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.566179 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-node-log" (OuterVolumeSpecName: "node-log") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.574498 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/066df704-6981-4770-a647-df52a0da50a0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.579822 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/066df704-6981-4770-a647-df52a0da50a0-kube-api-access-l2k2d" (OuterVolumeSpecName: "kube-api-access-l2k2d") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "kube-api-access-l2k2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.587765 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "066df704-6981-4770-a647-df52a0da50a0" (UID: "066df704-6981-4770-a647-df52a0da50a0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662642 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-etc-openvswitch\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662688 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662739 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c229880-23d0-4fb6-b111-883f50038f6d-ovn-node-metrics-cert\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662761 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-cni-netd\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662792 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-var-lib-openvswitch\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662788 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-etc-openvswitch\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662804 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662814 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-cni-bin\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662857 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-cni-bin\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662874 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0c229880-23d0-4fb6-b111-883f50038f6d-ovnkube-script-lib\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662901 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-cni-netd\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.662947 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-var-lib-openvswitch\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663052 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-slash\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663120 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-run-netns\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663165 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-run-ovn-kubernetes\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663189 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-node-log\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663215 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-log-socket\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663245 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0c229880-23d0-4fb6-b111-883f50038f6d-ovnkube-config\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663292 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-run-systemd\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663321 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0c229880-23d0-4fb6-b111-883f50038f6d-env-overrides\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663350 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-kubelet\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663396 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-run-openvswitch\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663439 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87nzp\" (UniqueName: \"kubernetes.io/projected/0c229880-23d0-4fb6-b111-883f50038f6d-kube-api-access-87nzp\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663441 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-node-log\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663480 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-run-ovn\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663524 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-systemd-units\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663591 4698 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663787 4698 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-log-socket\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663792 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0c229880-23d0-4fb6-b111-883f50038f6d-ovnkube-script-lib\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663820 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-run-openvswitch\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663857 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-run-netns\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663880 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-run-systemd\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663913 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-slash\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.663938 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-kubelet\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664221 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-host-run-ovn-kubernetes\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664280 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-run-ovn\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664313 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-systemd-units\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664351 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0c229880-23d0-4fb6-b111-883f50038f6d-ovnkube-config\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664387 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0c229880-23d0-4fb6-b111-883f50038f6d-log-socket\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664408 4698 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664424 4698 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664426 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0c229880-23d0-4fb6-b111-883f50038f6d-env-overrides\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664437 4698 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664473 4698 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-systemd-units\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664492 4698 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664506 4698 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-cni-netd\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664521 4698 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-kubelet\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664533 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2k2d\" (UniqueName: \"kubernetes.io/projected/066df704-6981-4770-a647-df52a0da50a0-kube-api-access-l2k2d\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664545 4698 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/066df704-6981-4770-a647-df52a0da50a0-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664556 4698 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664568 4698 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/066df704-6981-4770-a647-df52a0da50a0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664581 4698 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664594 4698 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-host-slash\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664606 4698 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-node-log\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664617 4698 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-run-systemd\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.664628 4698 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/066df704-6981-4770-a647-df52a0da50a0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.666972 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c229880-23d0-4fb6-b111-883f50038f6d-ovn-node-metrics-cert\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.680153 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87nzp\" (UniqueName: \"kubernetes.io/projected/0c229880-23d0-4fb6-b111-883f50038f6d-kube-api-access-87nzp\") pod \"ovnkube-node-s4js6\" (UID: \"0c229880-23d0-4fb6-b111-883f50038f6d\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: I0224 10:27:51.863698 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6"
Feb 24 10:27:51 crc kubenswrapper[4698]: W0224 10:27:51.882220 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c229880_23d0_4fb6_b111_883f50038f6d.slice/crio-23cd7d0ee0b37c0034bcf5730346c1ad999e07fa4329fc81a64193a2c009562f WatchSource:0}: Error finding container 23cd7d0ee0b37c0034bcf5730346c1ad999e07fa4329fc81a64193a2c009562f: Status 404 returned error can't find the container with id 23cd7d0ee0b37c0034bcf5730346c1ad999e07fa4329fc81a64193a2c009562f
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.013652 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mbk6_17dd9ce8-b1ca-4810-85fe-9775919eb4b5/kube-multus/2.log"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.014462 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mbk6_17dd9ce8-b1ca-4810-85fe-9775919eb4b5/kube-multus/1.log"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.014515 4698 generic.go:334] "Generic (PLEG): container finished" podID="17dd9ce8-b1ca-4810-85fe-9775919eb4b5" containerID="ab364baedbeb66518d2c61a0989a799a3a60377047595973f394b87edd9b060a" exitCode=2
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.014581 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7mbk6" event={"ID":"17dd9ce8-b1ca-4810-85fe-9775919eb4b5","Type":"ContainerDied","Data":"ab364baedbeb66518d2c61a0989a799a3a60377047595973f394b87edd9b060a"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.014629 4698 scope.go:117] "RemoveContainer" containerID="26cc85a7a79119a1df0de0f47a3098d7417118ce0da5b300f453a3d8c4f351a7"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.015150 4698 scope.go:117] "RemoveContainer" containerID="ab364baedbeb66518d2c61a0989a799a3a60377047595973f394b87edd9b060a"
Feb 24 10:27:52 crc kubenswrapper[4698]: E0224 10:27:52.015399 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7mbk6_openshift-multus(17dd9ce8-b1ca-4810-85fe-9775919eb4b5)\"" pod="openshift-multus/multus-7mbk6" podUID="17dd9ce8-b1ca-4810-85fe-9775919eb4b5"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.021441 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" event={"ID":"0c229880-23d0-4fb6-b111-883f50038f6d","Type":"ContainerStarted","Data":"23cd7d0ee0b37c0034bcf5730346c1ad999e07fa4329fc81a64193a2c009562f"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.025446 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovnkube-controller/3.log"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.027798 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovn-acl-logging/0.log"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.028499 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mgh7p_066df704-6981-4770-a647-df52a0da50a0/ovn-controller/0.log"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029013 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c" exitCode=0
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029037 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288" exitCode=0
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029045 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585" exitCode=0
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029053 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20" exitCode=0
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029060 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07" exitCode=0
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029065 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece" exitCode=0
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029073 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99" exitCode=143
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029080 4698 generic.go:334] "Generic (PLEG): container finished" podID="066df704-6981-4770-a647-df52a0da50a0" containerID="096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213" exitCode=143
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029209 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029564 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029681 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029763 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029854 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.029959 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.030091 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.030199 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.030323 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.030414 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.030488 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.030560 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.030633 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.030704 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.030773 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.030842 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.030915 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031007 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031143 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031242 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031354 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031424 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031497 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031566 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031631 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031694 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031757 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031909 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.031993 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032093 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032165 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032236 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032326 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032408 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032480 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032547 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032610 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032673 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032737 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032818 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mgh7p" event={"ID":"066df704-6981-4770-a647-df52a0da50a0","Type":"ContainerDied","Data":"221176da06c75722a417e733f5f7886a0da7218159233b2983b544c5fbd562d4"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.032904 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.033396 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.033558 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.034301 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.034407 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.034483 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.034551 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.034615 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.034679 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.034759 4698 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443"}
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.110471 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mgh7p"]
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.113395 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mgh7p"]
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.116231 4698 scope.go:117] "RemoveContainer" containerID="1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.139542 4698 scope.go:117] "RemoveContainer" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.160728 4698 scope.go:117] "RemoveContainer" containerID="1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.174283 4698 scope.go:117] "RemoveContainer" containerID="7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.188043 4698 scope.go:117] "RemoveContainer" containerID="f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.205227 4698 scope.go:117] "RemoveContainer" containerID="5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.268752 4698 scope.go:117] "RemoveContainer" containerID="60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.287674 4698 scope.go:117] "RemoveContainer" containerID="444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.306091 4698 scope.go:117] "RemoveContainer" containerID="096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.325433 4698 scope.go:117] "RemoveContainer" containerID="363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.338957 4698 scope.go:117] "RemoveContainer" containerID="1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"
Feb 24 10:27:52 crc kubenswrapper[4698]: E0224 10:27:52.339727 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c\": container with ID starting with 1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c not found: ID does not exist" containerID="1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.339772 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"} err="failed to get container status \"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c\": rpc error: code = NotFound desc = could not find container \"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c\": container with ID starting with 1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c not found: ID does not exist"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.339802 4698 scope.go:117] "RemoveContainer" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"
Feb 24 10:27:52 crc kubenswrapper[4698]: E0224 10:27:52.340244 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\": container with ID starting with b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71 not found: ID does not exist" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.340350 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"} err="failed to get container status \"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\": rpc error: code = NotFound desc = could not find container \"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\": container with ID starting with b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71 not found: ID does not exist"
Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.340394 4698 scope.go:117] "RemoveContainer" containerID="1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"
Feb 24
10:27:52 crc kubenswrapper[4698]: E0224 10:27:52.341088 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\": container with ID starting with 1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288 not found: ID does not exist" containerID="1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.341117 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"} err="failed to get container status \"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\": rpc error: code = NotFound desc = could not find container \"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\": container with ID starting with 1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.341132 4698 scope.go:117] "RemoveContainer" containerID="7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585" Feb 24 10:27:52 crc kubenswrapper[4698]: E0224 10:27:52.341486 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\": container with ID starting with 7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585 not found: ID does not exist" containerID="7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.341538 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585"} err="failed to get container status 
\"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\": rpc error: code = NotFound desc = could not find container \"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\": container with ID starting with 7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.341570 4698 scope.go:117] "RemoveContainer" containerID="f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20" Feb 24 10:27:52 crc kubenswrapper[4698]: E0224 10:27:52.341841 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\": container with ID starting with f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20 not found: ID does not exist" containerID="f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.341877 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20"} err="failed to get container status \"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\": rpc error: code = NotFound desc = could not find container \"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\": container with ID starting with f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.341900 4698 scope.go:117] "RemoveContainer" containerID="5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07" Feb 24 10:27:52 crc kubenswrapper[4698]: E0224 10:27:52.342327 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\": container with ID starting with 5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07 not found: ID does not exist" containerID="5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.342377 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07"} err="failed to get container status \"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\": rpc error: code = NotFound desc = could not find container \"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\": container with ID starting with 5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.342411 4698 scope.go:117] "RemoveContainer" containerID="60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece" Feb 24 10:27:52 crc kubenswrapper[4698]: E0224 10:27:52.342783 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\": container with ID starting with 60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece not found: ID does not exist" containerID="60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.342811 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece"} err="failed to get container status \"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\": rpc error: code = NotFound desc = could not find container \"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\": container with ID 
starting with 60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.342826 4698 scope.go:117] "RemoveContainer" containerID="444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99" Feb 24 10:27:52 crc kubenswrapper[4698]: E0224 10:27:52.343183 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\": container with ID starting with 444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99 not found: ID does not exist" containerID="444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.343215 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99"} err="failed to get container status \"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\": rpc error: code = NotFound desc = could not find container \"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\": container with ID starting with 444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.343236 4698 scope.go:117] "RemoveContainer" containerID="096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213" Feb 24 10:27:52 crc kubenswrapper[4698]: E0224 10:27:52.343619 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\": container with ID starting with 096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213 not found: ID does not exist" containerID="096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213" Feb 24 
10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.343645 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213"} err="failed to get container status \"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\": rpc error: code = NotFound desc = could not find container \"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\": container with ID starting with 096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.343665 4698 scope.go:117] "RemoveContainer" containerID="363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443" Feb 24 10:27:52 crc kubenswrapper[4698]: E0224 10:27:52.343925 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\": container with ID starting with 363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443 not found: ID does not exist" containerID="363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.343951 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443"} err="failed to get container status \"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\": rpc error: code = NotFound desc = could not find container \"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\": container with ID starting with 363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.343965 4698 scope.go:117] "RemoveContainer" 
containerID="1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.344382 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"} err="failed to get container status \"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c\": rpc error: code = NotFound desc = could not find container \"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c\": container with ID starting with 1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.344427 4698 scope.go:117] "RemoveContainer" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.344824 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"} err="failed to get container status \"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\": rpc error: code = NotFound desc = could not find container \"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\": container with ID starting with b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.344851 4698 scope.go:117] "RemoveContainer" containerID="1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.345188 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"} err="failed to get container status \"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\": rpc error: code = NotFound desc = could 
not find container \"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\": container with ID starting with 1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.345220 4698 scope.go:117] "RemoveContainer" containerID="7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.345710 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585"} err="failed to get container status \"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\": rpc error: code = NotFound desc = could not find container \"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\": container with ID starting with 7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.345828 4698 scope.go:117] "RemoveContainer" containerID="f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.346400 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20"} err="failed to get container status \"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\": rpc error: code = NotFound desc = could not find container \"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\": container with ID starting with f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.346422 4698 scope.go:117] "RemoveContainer" containerID="5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 
10:27:52.346788 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07"} err="failed to get container status \"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\": rpc error: code = NotFound desc = could not find container \"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\": container with ID starting with 5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.346817 4698 scope.go:117] "RemoveContainer" containerID="60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.347160 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece"} err="failed to get container status \"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\": rpc error: code = NotFound desc = could not find container \"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\": container with ID starting with 60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.347207 4698 scope.go:117] "RemoveContainer" containerID="444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.347576 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99"} err="failed to get container status \"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\": rpc error: code = NotFound desc = could not find container \"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\": container with ID starting with 
444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.347594 4698 scope.go:117] "RemoveContainer" containerID="096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.347914 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213"} err="failed to get container status \"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\": rpc error: code = NotFound desc = could not find container \"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\": container with ID starting with 096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.347942 4698 scope.go:117] "RemoveContainer" containerID="363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.348246 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443"} err="failed to get container status \"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\": rpc error: code = NotFound desc = could not find container \"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\": container with ID starting with 363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.348315 4698 scope.go:117] "RemoveContainer" containerID="1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.348731 4698 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"} err="failed to get container status \"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c\": rpc error: code = NotFound desc = could not find container \"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c\": container with ID starting with 1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.348754 4698 scope.go:117] "RemoveContainer" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.349033 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"} err="failed to get container status \"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\": rpc error: code = NotFound desc = could not find container \"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\": container with ID starting with b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.349063 4698 scope.go:117] "RemoveContainer" containerID="1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.349427 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"} err="failed to get container status \"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\": rpc error: code = NotFound desc = could not find container \"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\": container with ID starting with 1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288 not found: ID does not 
exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.349471 4698 scope.go:117] "RemoveContainer" containerID="7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.349829 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585"} err="failed to get container status \"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\": rpc error: code = NotFound desc = could not find container \"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\": container with ID starting with 7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.349868 4698 scope.go:117] "RemoveContainer" containerID="f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.350111 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20"} err="failed to get container status \"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\": rpc error: code = NotFound desc = could not find container \"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\": container with ID starting with f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.350140 4698 scope.go:117] "RemoveContainer" containerID="5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.350370 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07"} err="failed to get container status 
\"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\": rpc error: code = NotFound desc = could not find container \"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\": container with ID starting with 5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.350395 4698 scope.go:117] "RemoveContainer" containerID="60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.350582 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece"} err="failed to get container status \"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\": rpc error: code = NotFound desc = could not find container \"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\": container with ID starting with 60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.350598 4698 scope.go:117] "RemoveContainer" containerID="444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.350827 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99"} err="failed to get container status \"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\": rpc error: code = NotFound desc = could not find container \"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\": container with ID starting with 444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.350847 4698 scope.go:117] "RemoveContainer" 
containerID="096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.351031 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213"} err="failed to get container status \"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\": rpc error: code = NotFound desc = could not find container \"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\": container with ID starting with 096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.351045 4698 scope.go:117] "RemoveContainer" containerID="363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.351307 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443"} err="failed to get container status \"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\": rpc error: code = NotFound desc = could not find container \"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\": container with ID starting with 363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.351321 4698 scope.go:117] "RemoveContainer" containerID="1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.351528 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c"} err="failed to get container status \"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c\": rpc error: code = NotFound desc = could 
not find container \"1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c\": container with ID starting with 1058035c4f9ec53ded52f7e95037f92da16967a87ba1ef415eb2df5ed366da4c not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.351545 4698 scope.go:117] "RemoveContainer" containerID="b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.351746 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71"} err="failed to get container status \"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\": rpc error: code = NotFound desc = could not find container \"b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71\": container with ID starting with b4f0637ffd869edc84aea294e257ec525bede2fdb6f95377ebe6bf3fb1033d71 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.351764 4698 scope.go:117] "RemoveContainer" containerID="1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.351964 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288"} err="failed to get container status \"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\": rpc error: code = NotFound desc = could not find container \"1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288\": container with ID starting with 1288272246b8937c2880153451d797fc3328749902e2491e60c8f8f086c85288 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.351985 4698 scope.go:117] "RemoveContainer" containerID="7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 
10:27:52.352313 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585"} err="failed to get container status \"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\": rpc error: code = NotFound desc = could not find container \"7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585\": container with ID starting with 7adc5b73bdd01b1e822308534c8848e154a1d05ed5367b971b59a99289387585 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.352360 4698 scope.go:117] "RemoveContainer" containerID="f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.352598 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20"} err="failed to get container status \"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\": rpc error: code = NotFound desc = could not find container \"f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20\": container with ID starting with f2ec337c851d86c491d1ae5a667e4344ae4759f945b423d3a48838874a6eda20 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.352619 4698 scope.go:117] "RemoveContainer" containerID="5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.353011 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07"} err="failed to get container status \"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\": rpc error: code = NotFound desc = could not find container \"5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07\": container with ID starting with 
5e27ae8c6aa803d58f6ff0252273d2fcbbee794c49a13fc54bfe6677b5aa6e07 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.353034 4698 scope.go:117] "RemoveContainer" containerID="60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.353322 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece"} err="failed to get container status \"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\": rpc error: code = NotFound desc = could not find container \"60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece\": container with ID starting with 60215d9a7dc3fbaa1b045a76c018c910f3748c5bef5325716e0a28844bc91ece not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.353368 4698 scope.go:117] "RemoveContainer" containerID="444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.353630 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99"} err="failed to get container status \"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\": rpc error: code = NotFound desc = could not find container \"444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99\": container with ID starting with 444da705b890c795bca82d2bd44ad5b71ed9bcc95a70ee5c92755679af31aa99 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.353645 4698 scope.go:117] "RemoveContainer" containerID="096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.354005 4698 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213"} err="failed to get container status \"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\": rpc error: code = NotFound desc = could not find container \"096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213\": container with ID starting with 096010abeb5f4fc1cf8ab2a1a3e50000365a449d0747081df923bde1be7e1213 not found: ID does not exist" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.354025 4698 scope.go:117] "RemoveContainer" containerID="363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443" Feb 24 10:27:52 crc kubenswrapper[4698]: I0224 10:27:52.354372 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443"} err="failed to get container status \"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\": rpc error: code = NotFound desc = could not find container \"363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443\": container with ID starting with 363eade2263b2108feaaf0620f7f1fd910effb90ce635e5b749b59b407618443 not found: ID does not exist" Feb 24 10:27:53 crc kubenswrapper[4698]: I0224 10:27:53.036579 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mbk6_17dd9ce8-b1ca-4810-85fe-9775919eb4b5/kube-multus/2.log" Feb 24 10:27:53 crc kubenswrapper[4698]: I0224 10:27:53.039047 4698 generic.go:334] "Generic (PLEG): container finished" podID="0c229880-23d0-4fb6-b111-883f50038f6d" containerID="a83260f52cfb4000684ae8d6b34a10724f66a56d250d183efe407068b22b0c96" exitCode=0 Feb 24 10:27:53 crc kubenswrapper[4698]: I0224 10:27:53.039093 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" 
event={"ID":"0c229880-23d0-4fb6-b111-883f50038f6d","Type":"ContainerDied","Data":"a83260f52cfb4000684ae8d6b34a10724f66a56d250d183efe407068b22b0c96"} Feb 24 10:27:53 crc kubenswrapper[4698]: I0224 10:27:53.623181 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="066df704-6981-4770-a647-df52a0da50a0" path="/var/lib/kubelet/pods/066df704-6981-4770-a647-df52a0da50a0/volumes" Feb 24 10:27:54 crc kubenswrapper[4698]: I0224 10:27:54.049247 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" event={"ID":"0c229880-23d0-4fb6-b111-883f50038f6d","Type":"ContainerStarted","Data":"6664b1061c2a8a6cdf008f42622884234dff50047194f83fba3b5362767d6586"} Feb 24 10:27:54 crc kubenswrapper[4698]: I0224 10:27:54.049453 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" event={"ID":"0c229880-23d0-4fb6-b111-883f50038f6d","Type":"ContainerStarted","Data":"a1b26a6e8e16cde3ff0cdf6dd0057f60a72b3053fc485858b99e507d0e3a24ec"} Feb 24 10:27:54 crc kubenswrapper[4698]: I0224 10:27:54.049512 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" event={"ID":"0c229880-23d0-4fb6-b111-883f50038f6d","Type":"ContainerStarted","Data":"c7eff72d137c59a5806e9067103fd0d4f84e5ef595273c8844cd5b2b66f845ca"} Feb 24 10:27:54 crc kubenswrapper[4698]: I0224 10:27:54.049588 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" event={"ID":"0c229880-23d0-4fb6-b111-883f50038f6d","Type":"ContainerStarted","Data":"665103ef32e1fea3a3d57682e6260ce557ab4d06339b3648d3f05b119b5487a0"} Feb 24 10:27:55 crc kubenswrapper[4698]: I0224 10:27:55.063914 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" event={"ID":"0c229880-23d0-4fb6-b111-883f50038f6d","Type":"ContainerStarted","Data":"440bf9a5a200dc6b40975990e5f1811b00a75fb28612f9fdb552b3d2691f2a97"} 
Feb 24 10:27:55 crc kubenswrapper[4698]: I0224 10:27:55.063979 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" event={"ID":"0c229880-23d0-4fb6-b111-883f50038f6d","Type":"ContainerStarted","Data":"00c5a504faf1080950bad63ff934ac7469f3fa6115be779a8fb075a83989b1bb"} Feb 24 10:27:56 crc kubenswrapper[4698]: I0224 10:27:56.248067 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-v4cfs" Feb 24 10:27:57 crc kubenswrapper[4698]: I0224 10:27:57.086076 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" event={"ID":"0c229880-23d0-4fb6-b111-883f50038f6d","Type":"ContainerStarted","Data":"4c9ce55304ccffe2f418653bea6816445c63e681e1a7fc05dc4f95c8b047309d"} Feb 24 10:27:57 crc kubenswrapper[4698]: I0224 10:27:57.443678 4698 ???:1] "http: TLS handshake error from 192.168.126.11:44636: no serving certificate available for the kubelet" Feb 24 10:27:59 crc kubenswrapper[4698]: I0224 10:27:59.102446 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" event={"ID":"0c229880-23d0-4fb6-b111-883f50038f6d","Type":"ContainerStarted","Data":"f70950029b2932aa6f65150ed470ce340bde7e4776610701faae23fcdbf144e6"} Feb 24 10:27:59 crc kubenswrapper[4698]: I0224 10:27:59.104819 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:59 crc kubenswrapper[4698]: I0224 10:27:59.104876 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:59 crc kubenswrapper[4698]: I0224 10:27:59.104960 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:59 crc kubenswrapper[4698]: I0224 10:27:59.194307 4698 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" podStartSLOduration=8.194289031 podStartE2EDuration="8.194289031s" podCreationTimestamp="2026-02-24 10:27:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-24 10:27:59.191676338 +0000 UTC m=+704.305290579" watchObservedRunningTime="2026-02-24 10:27:59.194289031 +0000 UTC m=+704.307903282" Feb 24 10:27:59 crc kubenswrapper[4698]: I0224 10:27:59.199975 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:27:59 crc kubenswrapper[4698]: I0224 10:27:59.200240 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:28:03 crc kubenswrapper[4698]: I0224 10:28:03.614376 4698 scope.go:117] "RemoveContainer" containerID="ab364baedbeb66518d2c61a0989a799a3a60377047595973f394b87edd9b060a" Feb 24 10:28:03 crc kubenswrapper[4698]: E0224 10:28:03.615208 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7mbk6_openshift-multus(17dd9ce8-b1ca-4810-85fe-9775919eb4b5)\"" pod="openshift-multus/multus-7mbk6" podUID="17dd9ce8-b1ca-4810-85fe-9775919eb4b5" Feb 24 10:28:18 crc kubenswrapper[4698]: I0224 10:28:18.615236 4698 scope.go:117] "RemoveContainer" containerID="ab364baedbeb66518d2c61a0989a799a3a60377047595973f394b87edd9b060a" Feb 24 10:28:19 crc kubenswrapper[4698]: I0224 10:28:19.242243 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mbk6_17dd9ce8-b1ca-4810-85fe-9775919eb4b5/kube-multus/2.log" Feb 24 10:28:19 crc kubenswrapper[4698]: I0224 10:28:19.242684 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7mbk6" 
event={"ID":"17dd9ce8-b1ca-4810-85fe-9775919eb4b5","Type":"ContainerStarted","Data":"18bec6089d0b64059ec3365259fb5e79623e7d896df0218a9a3cc990bb6c1731"} Feb 24 10:28:21 crc kubenswrapper[4698]: I0224 10:28:21.924452 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s4js6" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.191397 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph"] Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.192343 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.193870 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.194389 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.194637 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xrhf6" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.274095 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/cb7a90fa-4b3f-41d0-92de-153dc40310ad-run\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.274166 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snp72\" (UniqueName: \"kubernetes.io/projected/cb7a90fa-4b3f-41d0-92de-153dc40310ad-kube-api-access-snp72\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.274367 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cb7a90fa-4b3f-41d0-92de-153dc40310ad-data\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.274474 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/cb7a90fa-4b3f-41d0-92de-153dc40310ad-log\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.375608 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cb7a90fa-4b3f-41d0-92de-153dc40310ad-data\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.375685 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/cb7a90fa-4b3f-41d0-92de-153dc40310ad-log\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.375734 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/empty-dir/cb7a90fa-4b3f-41d0-92de-153dc40310ad-run\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.375764 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snp72\" (UniqueName: \"kubernetes.io/projected/cb7a90fa-4b3f-41d0-92de-153dc40310ad-kube-api-access-snp72\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.376234 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" 
(UniqueName: \"kubernetes.io/empty-dir/cb7a90fa-4b3f-41d0-92de-153dc40310ad-run\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.376468 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log\" (UniqueName: \"kubernetes.io/empty-dir/cb7a90fa-4b3f-41d0-92de-153dc40310ad-log\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.376622 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/cb7a90fa-4b3f-41d0-92de-153dc40310ad-data\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.411374 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snp72\" (UniqueName: \"kubernetes.io/projected/cb7a90fa-4b3f-41d0-92de-153dc40310ad-kube-api-access-snp72\") pod \"ceph\" (UID: \"cb7a90fa-4b3f-41d0-92de-153dc40310ad\") " pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.507474 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph" Feb 24 10:28:22 crc kubenswrapper[4698]: W0224 10:28:22.532794 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb7a90fa_4b3f_41d0_92de_153dc40310ad.slice/crio-ef67222fb9f131c0b2b36154821a0e8ef9f243c509f2be796753da262f1db042 WatchSource:0}: Error finding container ef67222fb9f131c0b2b36154821a0e8ef9f243c509f2be796753da262f1db042: Status 404 returned error can't find the container with id ef67222fb9f131c0b2b36154821a0e8ef9f243c509f2be796753da262f1db042 Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.582323 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35650: no serving certificate available for the kubelet" Feb 24 10:28:22 crc kubenswrapper[4698]: I0224 10:28:22.596037 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35666: no serving certificate available for the kubelet" Feb 24 10:28:23 crc kubenswrapper[4698]: I0224 10:28:23.270410 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"cb7a90fa-4b3f-41d0-92de-153dc40310ad","Type":"ContainerStarted","Data":"ef67222fb9f131c0b2b36154821a0e8ef9f243c509f2be796753da262f1db042"} Feb 24 10:28:23 crc kubenswrapper[4698]: I0224 10:28:23.778900 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35680: no serving certificate available for the kubelet" Feb 24 10:28:23 crc kubenswrapper[4698]: I0224 10:28:23.789411 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35696: no serving certificate available for the kubelet" Feb 24 10:28:24 crc kubenswrapper[4698]: I0224 10:28:24.926896 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35702: no serving certificate available for the kubelet" Feb 24 10:28:24 crc kubenswrapper[4698]: I0224 10:28:24.937824 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35708: no serving certificate available for the kubelet" Feb 24 10:28:26 crc kubenswrapper[4698]: I0224 
10:28:26.124039 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35716: no serving certificate available for the kubelet" Feb 24 10:28:26 crc kubenswrapper[4698]: I0224 10:28:26.139966 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35724: no serving certificate available for the kubelet" Feb 24 10:28:27 crc kubenswrapper[4698]: I0224 10:28:27.288047 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35730: no serving certificate available for the kubelet" Feb 24 10:28:27 crc kubenswrapper[4698]: I0224 10:28:27.302489 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35732: no serving certificate available for the kubelet" Feb 24 10:28:28 crc kubenswrapper[4698]: I0224 10:28:28.457698 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35734: no serving certificate available for the kubelet" Feb 24 10:28:28 crc kubenswrapper[4698]: I0224 10:28:28.473133 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35736: no serving certificate available for the kubelet" Feb 24 10:28:29 crc kubenswrapper[4698]: I0224 10:28:29.634960 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35752: no serving certificate available for the kubelet" Feb 24 10:28:29 crc kubenswrapper[4698]: I0224 10:28:29.653179 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35756: no serving certificate available for the kubelet" Feb 24 10:28:30 crc kubenswrapper[4698]: I0224 10:28:30.808062 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35770: no serving certificate available for the kubelet" Feb 24 10:28:30 crc kubenswrapper[4698]: I0224 10:28:30.839582 4698 ???:1] "http: TLS handshake error from 192.168.126.11:35786: no serving certificate available for the kubelet" Feb 24 10:28:32 crc kubenswrapper[4698]: I0224 10:28:32.030111 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37256: no serving certificate available for the kubelet" Feb 24 10:28:32 crc kubenswrapper[4698]: I0224 10:28:32.044945 4698 ???:1] 
"http: TLS handshake error from 192.168.126.11:37260: no serving certificate available for the kubelet" Feb 24 10:28:33 crc kubenswrapper[4698]: I0224 10:28:33.192319 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37262: no serving certificate available for the kubelet" Feb 24 10:28:33 crc kubenswrapper[4698]: I0224 10:28:33.206393 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37264: no serving certificate available for the kubelet" Feb 24 10:28:34 crc kubenswrapper[4698]: I0224 10:28:34.334902 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37268: no serving certificate available for the kubelet" Feb 24 10:28:34 crc kubenswrapper[4698]: I0224 10:28:34.351599 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37284: no serving certificate available for the kubelet" Feb 24 10:28:35 crc kubenswrapper[4698]: I0224 10:28:35.512401 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37298: no serving certificate available for the kubelet" Feb 24 10:28:35 crc kubenswrapper[4698]: I0224 10:28:35.529067 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37300: no serving certificate available for the kubelet" Feb 24 10:28:36 crc kubenswrapper[4698]: I0224 10:28:36.696235 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37312: no serving certificate available for the kubelet" Feb 24 10:28:36 crc kubenswrapper[4698]: I0224 10:28:36.709889 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37322: no serving certificate available for the kubelet" Feb 24 10:28:37 crc kubenswrapper[4698]: I0224 10:28:37.869464 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37332: no serving certificate available for the kubelet" Feb 24 10:28:37 crc kubenswrapper[4698]: I0224 10:28:37.884254 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37342: no serving certificate available for the kubelet" Feb 24 10:28:39 crc kubenswrapper[4698]: I0224 10:28:39.038632 4698 ???:1] "http: TLS handshake error 
from 192.168.126.11:37356: no serving certificate available for the kubelet" Feb 24 10:28:39 crc kubenswrapper[4698]: I0224 10:28:39.085392 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37358: no serving certificate available for the kubelet" Feb 24 10:28:40 crc kubenswrapper[4698]: I0224 10:28:40.289193 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37366: no serving certificate available for the kubelet" Feb 24 10:28:40 crc kubenswrapper[4698]: I0224 10:28:40.305829 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37370: no serving certificate available for the kubelet" Feb 24 10:28:41 crc kubenswrapper[4698]: I0224 10:28:41.487690 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37374: no serving certificate available for the kubelet" Feb 24 10:28:41 crc kubenswrapper[4698]: I0224 10:28:41.505948 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37388: no serving certificate available for the kubelet" Feb 24 10:28:42 crc kubenswrapper[4698]: I0224 10:28:42.720201 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42556: no serving certificate available for the kubelet" Feb 24 10:28:42 crc kubenswrapper[4698]: I0224 10:28:42.738307 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42558: no serving certificate available for the kubelet" Feb 24 10:28:43 crc kubenswrapper[4698]: I0224 10:28:43.959835 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42564: no serving certificate available for the kubelet" Feb 24 10:28:43 crc kubenswrapper[4698]: I0224 10:28:43.975316 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42570: no serving certificate available for the kubelet" Feb 24 10:28:45 crc kubenswrapper[4698]: I0224 10:28:45.123714 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42580: no serving certificate available for the kubelet" Feb 24 10:28:45 crc kubenswrapper[4698]: I0224 10:28:45.143156 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42586: no 
serving certificate available for the kubelet" Feb 24 10:28:46 crc kubenswrapper[4698]: E0224 10:28:46.212493 4698 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/ceph/demo:latest-squid" Feb 24 10:28:46 crc kubenswrapper[4698]: E0224 10:28:46.212762 4698 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceph,Image:quay.io/ceph/demo:latest-squid,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:MON_IP,Value:192.168.126.11,ValueFrom:nil,},EnvVar{Name:CEPH_DAEMON,Value:demo,ValueFrom:nil,},EnvVar{Name:CEPH_PUBLIC_NETWORK,Value:0.0.0.0/0,ValueFrom:nil,},EnvVar{Name:DEMO_DAEMONS,Value:osd,mds,rgw,ValueFrom:nil,},EnvVar{Name:CEPH_DEMO_UID,Value:0,ValueFrom:nil,},EnvVar{Name:RGW_NAME,Value:ceph,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:data,ReadOnly:false,MountPath:/var/lib/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:log,ReadOnly:false,MountPath:/var/log/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:run,ReadOnly:false,MountPath:/run/ceph,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-snp72,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceph_openstack(cb7a90fa-4b3f-41d0-92de-153dc40310ad): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 24 10:28:46 crc kubenswrapper[4698]: E0224 10:28:46.213954 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceph" podUID="cb7a90fa-4b3f-41d0-92de-153dc40310ad" Feb 24 10:28:46 crc kubenswrapper[4698]: I0224 10:28:46.323462 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42594: no serving certificate available for the kubelet" Feb 24 10:28:46 crc kubenswrapper[4698]: I0224 10:28:46.340566 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42604: no serving certificate available for the kubelet" Feb 24 10:28:46 crc kubenswrapper[4698]: E0224 10:28:46.422614 4698 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceph\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/ceph/demo:latest-squid\\\"\"" pod="openstack/ceph" podUID="cb7a90fa-4b3f-41d0-92de-153dc40310ad" Feb 24 10:28:47 crc kubenswrapper[4698]: I0224 10:28:47.512342 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42618: no serving certificate available for the kubelet" Feb 24 10:28:47 crc kubenswrapper[4698]: I0224 10:28:47.529231 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42624: no serving certificate available for the kubelet" Feb 24 10:28:48 crc kubenswrapper[4698]: I0224 10:28:48.717013 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42628: no serving certificate available for the kubelet" Feb 24 10:28:48 crc kubenswrapper[4698]: I0224 10:28:48.734775 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42634: no serving certificate available for the kubelet" Feb 24 10:28:49 crc kubenswrapper[4698]: I0224 10:28:49.888213 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42636: no serving certificate available for the kubelet" Feb 24 10:28:49 crc 
kubenswrapper[4698]: I0224 10:28:49.903341 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42644: no serving certificate available for the kubelet" Feb 24 10:28:51 crc kubenswrapper[4698]: I0224 10:28:51.066328 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42660: no serving certificate available for the kubelet" Feb 24 10:28:51 crc kubenswrapper[4698]: I0224 10:28:51.082764 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42672: no serving certificate available for the kubelet" Feb 24 10:28:52 crc kubenswrapper[4698]: I0224 10:28:52.296252 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56016: no serving certificate available for the kubelet" Feb 24 10:28:52 crc kubenswrapper[4698]: I0224 10:28:52.313405 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56024: no serving certificate available for the kubelet" Feb 24 10:28:53 crc kubenswrapper[4698]: I0224 10:28:53.517129 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56032: no serving certificate available for the kubelet" Feb 24 10:28:53 crc kubenswrapper[4698]: I0224 10:28:53.531184 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56042: no serving certificate available for the kubelet" Feb 24 10:28:54 crc kubenswrapper[4698]: I0224 10:28:54.715471 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56052: no serving certificate available for the kubelet" Feb 24 10:28:54 crc kubenswrapper[4698]: I0224 10:28:54.732248 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56064: no serving certificate available for the kubelet" Feb 24 10:28:55 crc kubenswrapper[4698]: I0224 10:28:55.923595 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56080: no serving certificate available for the kubelet" Feb 24 10:28:55 crc kubenswrapper[4698]: I0224 10:28:55.944376 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56082: no serving certificate available for the kubelet" Feb 24 10:28:57 crc kubenswrapper[4698]: I0224 
10:28:57.126885 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56098: no serving certificate available for the kubelet"
Feb 24 10:28:57 crc kubenswrapper[4698]: I0224 10:28:57.144394 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56100: no serving certificate available for the kubelet"
Feb 24 10:28:58 crc kubenswrapper[4698]: I0224 10:28:58.319811 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56102: no serving certificate available for the kubelet"
Feb 24 10:28:58 crc kubenswrapper[4698]: I0224 10:28:58.346085 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56104: no serving certificate available for the kubelet"
Feb 24 10:28:59 crc kubenswrapper[4698]: I0224 10:28:59.509994 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56106: no serving certificate available for the kubelet"
Feb 24 10:28:59 crc kubenswrapper[4698]: I0224 10:28:59.526514 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56116: no serving certificate available for the kubelet"
Feb 24 10:29:00 crc kubenswrapper[4698]: I0224 10:29:00.512064 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph" event={"ID":"cb7a90fa-4b3f-41d0-92de-153dc40310ad","Type":"ContainerStarted","Data":"2711f654c1793029526e44b3602037e1acef1baf5e4d8cc66da7f8fb7f57106a"}
Feb 24 10:29:00 crc kubenswrapper[4698]: I0224 10:29:00.544567 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph" podStartSLOduration=1.30211737 podStartE2EDuration="38.544539779s" podCreationTimestamp="2026-02-24 10:28:22 +0000 UTC" firstStartedPulling="2026-02-24 10:28:22.536378923 +0000 UTC m=+727.649993184" lastFinishedPulling="2026-02-24 10:28:59.778801352 +0000 UTC m=+764.892415593" observedRunningTime="2026-02-24 10:29:00.54208438 +0000 UTC m=+765.655698651" watchObservedRunningTime="2026-02-24 10:29:00.544539779 +0000 UTC m=+765.658154050"
Feb 24 10:29:00 crc kubenswrapper[4698]: I0224 10:29:00.669540 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56120: no serving certificate available for the kubelet"
Feb 24 10:29:00 crc kubenswrapper[4698]: I0224 10:29:00.686147 4698 ???:1] "http: TLS handshake error from 192.168.126.11:56136: no serving certificate available for the kubelet"
Feb 24 10:29:01 crc kubenswrapper[4698]: I0224 10:29:01.937757 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37794: no serving certificate available for the kubelet"
Feb 24 10:29:01 crc kubenswrapper[4698]: I0224 10:29:01.950120 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37810: no serving certificate available for the kubelet"
Feb 24 10:29:03 crc kubenswrapper[4698]: I0224 10:29:03.118010 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37824: no serving certificate available for the kubelet"
Feb 24 10:29:03 crc kubenswrapper[4698]: I0224 10:29:03.135888 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37836: no serving certificate available for the kubelet"
Feb 24 10:29:04 crc kubenswrapper[4698]: I0224 10:29:04.289963 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37850: no serving certificate available for the kubelet"
Feb 24 10:29:04 crc kubenswrapper[4698]: I0224 10:29:04.302941 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37852: no serving certificate available for the kubelet"
Feb 24 10:29:05 crc kubenswrapper[4698]: I0224 10:29:05.442414 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37868: no serving certificate available for the kubelet"
Feb 24 10:29:05 crc kubenswrapper[4698]: I0224 10:29:05.455818 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37876: no serving certificate available for the kubelet"
Feb 24 10:29:06 crc kubenswrapper[4698]: I0224 10:29:06.649153 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37882: no serving certificate available for the kubelet"
Feb 24 10:29:06 crc kubenswrapper[4698]: I0224 10:29:06.670005 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37886: no serving certificate available for the kubelet"
Feb 24 10:29:07 crc kubenswrapper[4698]: I0224 10:29:07.869705 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37888: no serving certificate available for the kubelet"
Feb 24 10:29:07 crc kubenswrapper[4698]: I0224 10:29:07.887406 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37902: no serving certificate available for the kubelet"
Feb 24 10:29:09 crc kubenswrapper[4698]: I0224 10:29:09.024599 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37906: no serving certificate available for the kubelet"
Feb 24 10:29:09 crc kubenswrapper[4698]: I0224 10:29:09.039108 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37918: no serving certificate available for the kubelet"
Feb 24 10:29:10 crc kubenswrapper[4698]: I0224 10:29:10.200816 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37922: no serving certificate available for the kubelet"
Feb 24 10:29:10 crc kubenswrapper[4698]: I0224 10:29:10.214973 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37932: no serving certificate available for the kubelet"
Feb 24 10:29:11 crc kubenswrapper[4698]: I0224 10:29:11.409779 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37934: no serving certificate available for the kubelet"
Feb 24 10:29:11 crc kubenswrapper[4698]: I0224 10:29:11.424967 4698 ???:1] "http: TLS handshake error from 192.168.126.11:37942: no serving certificate available for the kubelet"
Feb 24 10:29:12 crc kubenswrapper[4698]: I0224 10:29:12.553982 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46706: no serving certificate available for the kubelet"
Feb 24 10:29:12 crc kubenswrapper[4698]: I0224 10:29:12.572586 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46722: no serving certificate available for the kubelet"
Feb 24 10:29:13 crc kubenswrapper[4698]: I0224 10:29:13.741603 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46726: no serving certificate available for the kubelet"
Feb 24 10:29:13 crc kubenswrapper[4698]: I0224 10:29:13.756168 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46728: no serving certificate available for the kubelet"
Feb 24 10:29:14 crc kubenswrapper[4698]: I0224 10:29:14.892510 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46744: no serving certificate available for the kubelet"
Feb 24 10:29:14 crc kubenswrapper[4698]: I0224 10:29:14.910329 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46756: no serving certificate available for the kubelet"
Feb 24 10:29:16 crc kubenswrapper[4698]: I0224 10:29:16.055331 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46758: no serving certificate available for the kubelet"
Feb 24 10:29:16 crc kubenswrapper[4698]: I0224 10:29:16.071188 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46770: no serving certificate available for the kubelet"
Feb 24 10:29:17 crc kubenswrapper[4698]: I0224 10:29:17.256027 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46776: no serving certificate available for the kubelet"
Feb 24 10:29:17 crc kubenswrapper[4698]: I0224 10:29:17.271559 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46778: no serving certificate available for the kubelet"
Feb 24 10:29:18 crc kubenswrapper[4698]: I0224 10:29:18.477128 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46790: no serving certificate available for the kubelet"
Feb 24 10:29:18 crc kubenswrapper[4698]: I0224 10:29:18.494641 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46806: no serving certificate available for the kubelet"
Feb 24 10:29:19 crc kubenswrapper[4698]: I0224 10:29:19.637621 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46818: no serving certificate available for the kubelet"
Feb 24 10:29:19 crc kubenswrapper[4698]: I0224 10:29:19.655965 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46832: no serving certificate available for the kubelet"
Feb 24 10:29:20 crc kubenswrapper[4698]: I0224 10:29:20.798865 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46846: no serving certificate available for the kubelet"
Feb 24 10:29:20 crc kubenswrapper[4698]: I0224 10:29:20.819349 4698 ???:1] "http: TLS handshake error from 192.168.126.11:46852: no serving certificate available for the kubelet"
Feb 24 10:29:21 crc kubenswrapper[4698]: I0224 10:29:21.988818 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39206: no serving certificate available for the kubelet"
Feb 24 10:29:22 crc kubenswrapper[4698]: I0224 10:29:22.002214 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39218: no serving certificate available for the kubelet"
Feb 24 10:29:22 crc kubenswrapper[4698]: I0224 10:29:22.196821 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:29:22 crc kubenswrapper[4698]: I0224 10:29:22.196897 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:29:23 crc kubenswrapper[4698]: I0224 10:29:23.135426 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39234: no serving certificate available for the kubelet"
Feb 24 10:29:23 crc kubenswrapper[4698]: I0224 10:29:23.149713 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39240: no serving certificate available for the kubelet"
Feb 24 10:29:24 crc kubenswrapper[4698]: I0224 10:29:24.299044 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39246: no serving certificate available for the kubelet"
Feb 24 10:29:24 crc kubenswrapper[4698]: I0224 10:29:24.313748 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39252: no serving certificate available for the kubelet"
Feb 24 10:29:25 crc kubenswrapper[4698]: I0224 10:29:25.476350 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39254: no serving certificate available for the kubelet"
Feb 24 10:29:25 crc kubenswrapper[4698]: I0224 10:29:25.492912 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39260: no serving certificate available for the kubelet"
Feb 24 10:29:26 crc kubenswrapper[4698]: I0224 10:29:26.630134 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39276: no serving certificate available for the kubelet"
Feb 24 10:29:26 crc kubenswrapper[4698]: I0224 10:29:26.642626 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39278: no serving certificate available for the kubelet"
Feb 24 10:29:27 crc kubenswrapper[4698]: I0224 10:29:27.827060 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39286: no serving certificate available for the kubelet"
Feb 24 10:29:27 crc kubenswrapper[4698]: I0224 10:29:27.841155 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39300: no serving certificate available for the kubelet"
Feb 24 10:29:28 crc kubenswrapper[4698]: I0224 10:29:28.981361 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39312: no serving certificate available for the kubelet"
Feb 24 10:29:28 crc kubenswrapper[4698]: I0224 10:29:28.995420 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39328: no serving certificate available for the kubelet"
Feb 24 10:29:30 crc kubenswrapper[4698]: I0224 10:29:30.155093 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39336: no serving certificate available for the kubelet"
Feb 24 10:29:30 crc kubenswrapper[4698]: I0224 10:29:30.172299 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39346: no serving certificate available for the kubelet"
Feb 24 10:29:31 crc kubenswrapper[4698]: I0224 10:29:31.327659 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39354: no serving certificate available for the kubelet"
Feb 24 10:29:31 crc kubenswrapper[4698]: I0224 10:29:31.341912 4698 ???:1] "http: TLS handshake error from 192.168.126.11:39360: no serving certificate available for the kubelet"
Feb 24 10:29:32 crc kubenswrapper[4698]: I0224 10:29:32.472755 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45492: no serving certificate available for the kubelet"
Feb 24 10:29:32 crc kubenswrapper[4698]: I0224 10:29:32.486954 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45494: no serving certificate available for the kubelet"
Feb 24 10:29:33 crc kubenswrapper[4698]: I0224 10:29:33.661874 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45496: no serving certificate available for the kubelet"
Feb 24 10:29:33 crc kubenswrapper[4698]: I0224 10:29:33.679620 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45504: no serving certificate available for the kubelet"
Feb 24 10:29:34 crc kubenswrapper[4698]: I0224 10:29:34.858000 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45508: no serving certificate available for the kubelet"
Feb 24 10:29:34 crc kubenswrapper[4698]: I0224 10:29:34.875391 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45516: no serving certificate available for the kubelet"
Feb 24 10:29:36 crc kubenswrapper[4698]: I0224 10:29:36.072193 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45528: no serving certificate available for the kubelet"
Feb 24 10:29:36 crc kubenswrapper[4698]: I0224 10:29:36.087494 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45536: no serving certificate available for the kubelet"
Feb 24 10:29:37 crc kubenswrapper[4698]: I0224 10:29:37.270374 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45552: no serving certificate available for the kubelet"
Feb 24 10:29:37 crc kubenswrapper[4698]: I0224 10:29:37.290056 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45562: no serving certificate available for the kubelet"
Feb 24 10:29:38 crc kubenswrapper[4698]: I0224 10:29:38.499878 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45578: no serving certificate available for the kubelet"
Feb 24 10:29:38 crc kubenswrapper[4698]: I0224 10:29:38.515671 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45580: no serving certificate available for the kubelet"
Feb 24 10:29:39 crc kubenswrapper[4698]: I0224 10:29:39.751442 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45586: no serving certificate available for the kubelet"
Feb 24 10:29:39 crc kubenswrapper[4698]: I0224 10:29:39.769253 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45592: no serving certificate available for the kubelet"
Feb 24 10:29:40 crc kubenswrapper[4698]: I0224 10:29:40.942169 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45600: no serving certificate available for the kubelet"
Feb 24 10:29:40 crc kubenswrapper[4698]: I0224 10:29:40.956712 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45604: no serving certificate available for the kubelet"
Feb 24 10:29:42 crc kubenswrapper[4698]: I0224 10:29:42.109852 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58536: no serving certificate available for the kubelet"
Feb 24 10:29:42 crc kubenswrapper[4698]: I0224 10:29:42.123796 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58552: no serving certificate available for the kubelet"
Feb 24 10:29:43 crc kubenswrapper[4698]: I0224 10:29:43.315306 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58554: no serving certificate available for the kubelet"
Feb 24 10:29:43 crc kubenswrapper[4698]: I0224 10:29:43.357950 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58562: no serving certificate available for the kubelet"
Feb 24 10:29:44 crc kubenswrapper[4698]: I0224 10:29:44.529737 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58576: no serving certificate available for the kubelet"
Feb 24 10:29:44 crc kubenswrapper[4698]: I0224 10:29:44.542601 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58588: no serving certificate available for the kubelet"
Feb 24 10:29:45 crc kubenswrapper[4698]: I0224 10:29:45.677723 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58596: no serving certificate available for the kubelet"
Feb 24 10:29:45 crc kubenswrapper[4698]: I0224 10:29:45.689243 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58608: no serving certificate available for the kubelet"
Feb 24 10:29:46 crc kubenswrapper[4698]: I0224 10:29:46.835221 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58616: no serving certificate available for the kubelet"
Feb 24 10:29:46 crc kubenswrapper[4698]: I0224 10:29:46.853140 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58620: no serving certificate available for the kubelet"
Feb 24 10:29:47 crc kubenswrapper[4698]: I0224 10:29:47.991040 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58636: no serving certificate available for the kubelet"
Feb 24 10:29:48 crc kubenswrapper[4698]: I0224 10:29:48.005667 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58648: no serving certificate available for the kubelet"
Feb 24 10:29:49 crc kubenswrapper[4698]: I0224 10:29:49.155808 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58658: no serving certificate available for the kubelet"
Feb 24 10:29:49 crc kubenswrapper[4698]: I0224 10:29:49.170217 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58660: no serving certificate available for the kubelet"
Feb 24 10:29:50 crc kubenswrapper[4698]: I0224 10:29:50.369635 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58674: no serving certificate available for the kubelet"
Feb 24 10:29:50 crc kubenswrapper[4698]: I0224 10:29:50.388514 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58686: no serving certificate available for the kubelet"
Feb 24 10:29:51 crc kubenswrapper[4698]: I0224 10:29:51.555913 4698 ???:1] "http: TLS handshake error from 192.168.126.11:54966: no serving certificate available for the kubelet"
Feb 24 10:29:51 crc kubenswrapper[4698]: I0224 10:29:51.568741 4698 ???:1] "http: TLS handshake error from 192.168.126.11:54982: no serving certificate available for the kubelet"
Feb 24 10:29:52 crc kubenswrapper[4698]: I0224 10:29:52.196915 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:29:52 crc kubenswrapper[4698]: I0224 10:29:52.197022 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:29:52 crc kubenswrapper[4698]: I0224 10:29:52.753658 4698 ???:1] "http: TLS handshake error from 192.168.126.11:54984: no serving certificate available for the kubelet"
Feb 24 10:29:52 crc kubenswrapper[4698]: I0224 10:29:52.772423 4698 ???:1] "http: TLS handshake error from 192.168.126.11:54996: no serving certificate available for the kubelet"
Feb 24 10:29:53 crc kubenswrapper[4698]: I0224 10:29:53.907949 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55008: no serving certificate available for the kubelet"
Feb 24 10:29:53 crc kubenswrapper[4698]: I0224 10:29:53.919390 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55014: no serving certificate available for the kubelet"
Feb 24 10:29:55 crc kubenswrapper[4698]: I0224 10:29:55.080383 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55028: no serving certificate available for the kubelet"
Feb 24 10:29:55 crc kubenswrapper[4698]: I0224 10:29:55.098924 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55030: no serving certificate available for the kubelet"
Feb 24 10:29:56 crc kubenswrapper[4698]: I0224 10:29:56.321345 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55038: no serving certificate available for the kubelet"
Feb 24 10:29:56 crc kubenswrapper[4698]: I0224 10:29:56.337675 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55042: no serving certificate available for the kubelet"
Feb 24 10:29:57 crc kubenswrapper[4698]: I0224 10:29:57.551438 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55050: no serving certificate available for the kubelet"
Feb 24 10:29:57 crc kubenswrapper[4698]: I0224 10:29:57.569836 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55052: no serving certificate available for the kubelet"
Feb 24 10:29:58 crc kubenswrapper[4698]: I0224 10:29:58.720468 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55060: no serving certificate available for the kubelet"
Feb 24 10:29:58 crc kubenswrapper[4698]: I0224 10:29:58.740226 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55068: no serving certificate available for the kubelet"
Feb 24 10:29:59 crc kubenswrapper[4698]: I0224 10:29:59.872032 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55076: no serving certificate available for the kubelet"
Feb 24 10:29:59 crc kubenswrapper[4698]: I0224 10:29:59.891370 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55090: no serving certificate available for the kubelet"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.206222 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"]
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.207312 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.209858 4698 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.212775 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.221683 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"]
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.251434 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-config-volume\") pod \"collect-profiles-29532150-ftznz\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.251522 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-secret-volume\") pod \"collect-profiles-29532150-ftznz\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.251692 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rnww\" (UniqueName: \"kubernetes.io/projected/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-kube-api-access-8rnww\") pod \"collect-profiles-29532150-ftznz\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.356416 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rnww\" (UniqueName: \"kubernetes.io/projected/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-kube-api-access-8rnww\") pod \"collect-profiles-29532150-ftznz\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.356497 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-config-volume\") pod \"collect-profiles-29532150-ftznz\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.356554 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-secret-volume\") pod \"collect-profiles-29532150-ftznz\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.359885 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-config-volume\") pod \"collect-profiles-29532150-ftznz\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.372372 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-secret-volume\") pod \"collect-profiles-29532150-ftznz\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.381444 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rnww\" (UniqueName: \"kubernetes.io/projected/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-kube-api-access-8rnww\") pod \"collect-profiles-29532150-ftznz\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.536408 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:00 crc kubenswrapper[4698]: I0224 10:30:00.740963 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"]
Feb 24 10:30:01 crc kubenswrapper[4698]: I0224 10:30:01.030084 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55100: no serving certificate available for the kubelet"
Feb 24 10:30:01 crc kubenswrapper[4698]: I0224 10:30:01.048861 4698 ???:1] "http: TLS handshake error from 192.168.126.11:55114: no serving certificate available for the kubelet"
Feb 24 10:30:01 crc kubenswrapper[4698]: I0224 10:30:01.368513 4698 generic.go:334] "Generic (PLEG): container finished" podID="97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8" containerID="7efcc978a6040c4acb5c249221b6c304f4dfaf303c68c772b0a03e6e45f223c6" exitCode=0
Feb 24 10:30:01 crc kubenswrapper[4698]: I0224 10:30:01.368870 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz" event={"ID":"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8","Type":"ContainerDied","Data":"7efcc978a6040c4acb5c249221b6c304f4dfaf303c68c772b0a03e6e45f223c6"}
Feb 24 10:30:01 crc kubenswrapper[4698]: I0224 10:30:01.369099 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz" event={"ID":"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8","Type":"ContainerStarted","Data":"2a6039787cf7952065afe3797ce7aff3b3a4cc50b38d0cbf13a221e77c9781f0"}
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.716434 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.801237 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-secret-volume\") pod \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") "
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.801357 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-config-volume\") pod \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") "
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.801422 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rnww\" (UniqueName: \"kubernetes.io/projected/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-kube-api-access-8rnww\") pod \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\" (UID: \"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8\") "
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.802413 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-config-volume" (OuterVolumeSpecName: "config-volume") pod "97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8" (UID: "97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.807769 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8" (UID: "97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.808248 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-kube-api-access-8rnww" (OuterVolumeSpecName: "kube-api-access-8rnww") pod "97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8" (UID: "97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8"). InnerVolumeSpecName "kube-api-access-8rnww". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.831283 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36014: no serving certificate available for the kubelet"
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.843874 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36028: no serving certificate available for the kubelet"
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.903152 4698 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.903201 4698 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-config-volume\") on node \"crc\" DevicePath \"\""
Feb 24 10:30:02 crc kubenswrapper[4698]: I0224 10:30:02.903222 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rnww\" (UniqueName: \"kubernetes.io/projected/97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8-kube-api-access-8rnww\") on node \"crc\" DevicePath \"\""
Feb 24 10:30:03 crc kubenswrapper[4698]: I0224 10:30:03.496010 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz" event={"ID":"97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8","Type":"ContainerDied","Data":"2a6039787cf7952065afe3797ce7aff3b3a4cc50b38d0cbf13a221e77c9781f0"}
Feb 24 10:30:03 crc kubenswrapper[4698]: I0224 10:30:03.496046 4698 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a6039787cf7952065afe3797ce7aff3b3a4cc50b38d0cbf13a221e77c9781f0"
Feb 24 10:30:03 crc kubenswrapper[4698]: I0224 10:30:03.496090 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29532150-ftznz"
Feb 24 10:30:03 crc kubenswrapper[4698]: I0224 10:30:03.982619 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36042: no serving certificate available for the kubelet"
Feb 24 10:30:03 crc kubenswrapper[4698]: I0224 10:30:03.995027 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36046: no serving certificate available for the kubelet"
Feb 24 10:30:05 crc kubenswrapper[4698]: I0224 10:30:05.117066 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36052: no serving certificate available for the kubelet"
Feb 24 10:30:05 crc kubenswrapper[4698]: I0224 10:30:05.132421 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36068: no serving certificate available for the kubelet"
Feb 24 10:30:06 crc kubenswrapper[4698]: I0224 10:30:06.283194 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36080: no serving certificate available for the kubelet"
Feb 24 10:30:06 crc kubenswrapper[4698]: I0224 10:30:06.294619 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36096: no serving certificate available for the kubelet"
Feb 24 10:30:08 crc kubenswrapper[4698]: I0224 10:30:08.541181 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36100: no serving certificate available for the kubelet"
Feb 24 10:30:08 crc kubenswrapper[4698]: I0224 10:30:08.571078 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36106: no serving certificate available for the kubelet"
Feb 24 10:30:09 crc kubenswrapper[4698]: I0224 10:30:09.722303 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36108: no serving certificate available for the kubelet"
Feb 24 10:30:09 crc kubenswrapper[4698]: I0224 10:30:09.736530 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36122: no serving certificate available for the kubelet"
Feb 24 10:30:10 crc kubenswrapper[4698]: I0224 10:30:10.870409 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36124: no serving certificate available for the kubelet"
Feb 24 10:30:10 crc kubenswrapper[4698]: I0224 10:30:10.881374 4698 ???:1] "http: TLS handshake error from 192.168.126.11:36138: no serving certificate available for the kubelet"
Feb 24 10:30:12 crc kubenswrapper[4698]: I0224 10:30:12.027455 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59210: no serving certificate available for the kubelet"
Feb 24 10:30:12 crc kubenswrapper[4698]: I0224 10:30:12.040932 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59222: no serving certificate available for the kubelet"
Feb 24 10:30:13 crc kubenswrapper[4698]: I0224 10:30:13.213573 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59238: no serving certificate available for the kubelet"
Feb 24 10:30:13 crc kubenswrapper[4698]: I0224 10:30:13.229240 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59244: no serving certificate available for the kubelet"
Feb 24 10:30:14 crc kubenswrapper[4698]: I0224 10:30:14.344640 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59246: no serving certificate available for the kubelet"
Feb 24 10:30:14 crc kubenswrapper[4698]: I0224 10:30:14.358797 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59254: no serving certificate available for the kubelet"
Feb 24 10:30:15 crc kubenswrapper[4698]: I0224 10:30:15.481640 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59270: no serving certificate available for the kubelet"
Feb 24 10:30:15 crc kubenswrapper[4698]: I0224 10:30:15.496687 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59278: no serving certificate available for the kubelet"
Feb 24 10:30:16 crc kubenswrapper[4698]: I0224 10:30:16.630068 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59282: no serving certificate available for the kubelet"
Feb 24 10:30:16 crc kubenswrapper[4698]: I0224 10:30:16.645135 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59298: no serving certificate available for the kubelet"
Feb 24 10:30:16 crc kubenswrapper[4698]: I0224 10:30:16.800063 4698 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Feb 24 10:30:17 crc kubenswrapper[4698]: I0224 10:30:17.772491 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59300: no serving certificate available for the kubelet"
Feb 24 10:30:17 crc kubenswrapper[4698]: I0224 10:30:17.785489 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59310: no serving certificate available for the kubelet"
Feb 24 10:30:18 crc kubenswrapper[4698]: I0224 10:30:18.947483 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59314: no serving certificate available for the kubelet"
Feb 24 10:30:18 crc kubenswrapper[4698]: I0224 10:30:18.966566 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59328: no serving certificate available for the kubelet"
Feb 24 10:30:20 crc kubenswrapper[4698]: I0224 10:30:20.635246 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59344: no serving certificate available for the kubelet"
Feb 24 10:30:20 crc kubenswrapper[4698]: I0224 10:30:20.648181 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59358: no serving certificate available for the kubelet"
Feb 24 10:30:21 crc kubenswrapper[4698]: I0224 10:30:21.826693 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33626: no serving certificate available for the kubelet"
Feb 24 10:30:21 crc kubenswrapper[4698]: I0224 10:30:21.841873 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33634: no serving certificate available for the kubelet"
Feb 24 10:30:22 crc kubenswrapper[4698]: I0224 10:30:22.196842 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 24 10:30:22 crc kubenswrapper[4698]: I0224 10:30:22.196919 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 24 10:30:22 crc kubenswrapper[4698]: I0224 10:30:22.196985 4698 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nn578"
Feb 24 10:30:22 crc kubenswrapper[4698]: I0224 10:30:22.197775 4698 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"829b9213c4c673d3133873424826a5ea12ee4cbf361962bd4c39f0f65c6f48c4"} pod="openshift-machine-config-operator/machine-config-daemon-nn578" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 24 10:30:22 crc kubenswrapper[4698]: I0224 10:30:22.197851 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" containerID="cri-o://829b9213c4c673d3133873424826a5ea12ee4cbf361962bd4c39f0f65c6f48c4" gracePeriod=600
Feb 24 10:30:22 crc kubenswrapper[4698]: I0224 10:30:22.617849 4698 generic.go:334] "Generic (PLEG): container finished" podID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerID="829b9213c4c673d3133873424826a5ea12ee4cbf361962bd4c39f0f65c6f48c4" exitCode=0
Feb 24 10:30:22 crc kubenswrapper[4698]: I0224 10:30:22.618045 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578"
event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerDied","Data":"829b9213c4c673d3133873424826a5ea12ee4cbf361962bd4c39f0f65c6f48c4"} Feb 24 10:30:22 crc kubenswrapper[4698]: I0224 10:30:22.618155 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerStarted","Data":"5b500e3b8410dff824193492887fc096fc35e4773517369eceee59151bac59ea"} Feb 24 10:30:22 crc kubenswrapper[4698]: I0224 10:30:22.618181 4698 scope.go:117] "RemoveContainer" containerID="0b7da0d5fae2f1471fcf65125ad5cf893f00a676ecd1a2c2a431023ddbdfc83e" Feb 24 10:30:23 crc kubenswrapper[4698]: I0224 10:30:23.030506 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33638: no serving certificate available for the kubelet" Feb 24 10:30:23 crc kubenswrapper[4698]: I0224 10:30:23.045439 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33642: no serving certificate available for the kubelet" Feb 24 10:30:24 crc kubenswrapper[4698]: I0224 10:30:24.204444 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33652: no serving certificate available for the kubelet" Feb 24 10:30:24 crc kubenswrapper[4698]: I0224 10:30:24.219026 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33662: no serving certificate available for the kubelet" Feb 24 10:30:25 crc kubenswrapper[4698]: I0224 10:30:25.369365 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33678: no serving certificate available for the kubelet" Feb 24 10:30:25 crc kubenswrapper[4698]: I0224 10:30:25.386045 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33690: no serving certificate available for the kubelet" Feb 24 10:30:26 crc kubenswrapper[4698]: I0224 10:30:26.563477 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33702: no serving certificate available for the kubelet" Feb 24 10:30:26 crc kubenswrapper[4698]: I0224 10:30:26.580783 4698 ???:1] 
"http: TLS handshake error from 192.168.126.11:33712: no serving certificate available for the kubelet" Feb 24 10:30:27 crc kubenswrapper[4698]: I0224 10:30:27.747094 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33722: no serving certificate available for the kubelet" Feb 24 10:30:27 crc kubenswrapper[4698]: I0224 10:30:27.761068 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33736: no serving certificate available for the kubelet" Feb 24 10:30:28 crc kubenswrapper[4698]: I0224 10:30:28.889410 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33748: no serving certificate available for the kubelet" Feb 24 10:30:28 crc kubenswrapper[4698]: I0224 10:30:28.907575 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33756: no serving certificate available for the kubelet" Feb 24 10:30:30 crc kubenswrapper[4698]: I0224 10:30:30.033524 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33760: no serving certificate available for the kubelet" Feb 24 10:30:30 crc kubenswrapper[4698]: I0224 10:30:30.048859 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33762: no serving certificate available for the kubelet" Feb 24 10:30:31 crc kubenswrapper[4698]: I0224 10:30:31.176710 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33778: no serving certificate available for the kubelet" Feb 24 10:30:31 crc kubenswrapper[4698]: I0224 10:30:31.190661 4698 ???:1] "http: TLS handshake error from 192.168.126.11:33780: no serving certificate available for the kubelet" Feb 24 10:30:32 crc kubenswrapper[4698]: I0224 10:30:32.317553 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58322: no serving certificate available for the kubelet" Feb 24 10:30:32 crc kubenswrapper[4698]: I0224 10:30:32.330146 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58336: no serving certificate available for the kubelet" Feb 24 10:30:33 crc kubenswrapper[4698]: I0224 10:30:33.484312 4698 ???:1] "http: TLS handshake error 
from 192.168.126.11:58342: no serving certificate available for the kubelet" Feb 24 10:30:33 crc kubenswrapper[4698]: I0224 10:30:33.501613 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58346: no serving certificate available for the kubelet" Feb 24 10:30:34 crc kubenswrapper[4698]: I0224 10:30:34.679832 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58358: no serving certificate available for the kubelet" Feb 24 10:30:34 crc kubenswrapper[4698]: I0224 10:30:34.695079 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58370: no serving certificate available for the kubelet" Feb 24 10:30:35 crc kubenswrapper[4698]: I0224 10:30:35.832714 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58380: no serving certificate available for the kubelet" Feb 24 10:30:35 crc kubenswrapper[4698]: I0224 10:30:35.849637 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58392: no serving certificate available for the kubelet" Feb 24 10:30:36 crc kubenswrapper[4698]: I0224 10:30:36.996694 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58394: no serving certificate available for the kubelet" Feb 24 10:30:37 crc kubenswrapper[4698]: I0224 10:30:37.022666 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58404: no serving certificate available for the kubelet" Feb 24 10:30:38 crc kubenswrapper[4698]: I0224 10:30:38.143900 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58414: no serving certificate available for the kubelet" Feb 24 10:30:38 crc kubenswrapper[4698]: I0224 10:30:38.156909 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58426: no serving certificate available for the kubelet" Feb 24 10:30:39 crc kubenswrapper[4698]: I0224 10:30:39.314030 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58432: no serving certificate available for the kubelet" Feb 24 10:30:39 crc kubenswrapper[4698]: I0224 10:30:39.328747 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58448: no 
serving certificate available for the kubelet" Feb 24 10:30:40 crc kubenswrapper[4698]: I0224 10:30:40.459885 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58460: no serving certificate available for the kubelet" Feb 24 10:30:40 crc kubenswrapper[4698]: I0224 10:30:40.476231 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58468: no serving certificate available for the kubelet" Feb 24 10:30:41 crc kubenswrapper[4698]: I0224 10:30:41.720160 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42528: no serving certificate available for the kubelet" Feb 24 10:30:41 crc kubenswrapper[4698]: I0224 10:30:41.739712 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42544: no serving certificate available for the kubelet" Feb 24 10:30:42 crc kubenswrapper[4698]: I0224 10:30:42.914287 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42554: no serving certificate available for the kubelet" Feb 24 10:30:42 crc kubenswrapper[4698]: I0224 10:30:42.928792 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42564: no serving certificate available for the kubelet" Feb 24 10:30:44 crc kubenswrapper[4698]: I0224 10:30:44.103442 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42570: no serving certificate available for the kubelet" Feb 24 10:30:44 crc kubenswrapper[4698]: I0224 10:30:44.121674 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42572: no serving certificate available for the kubelet" Feb 24 10:30:45 crc kubenswrapper[4698]: I0224 10:30:45.334112 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42578: no serving certificate available for the kubelet" Feb 24 10:30:45 crc kubenswrapper[4698]: I0224 10:30:45.355440 4698 ???:1] "http: TLS handshake error from 192.168.126.11:42586: no serving certificate available for the kubelet" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.127996 4698 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-vjgsb/must-gather-m5gz6"] Feb 24 10:31:16 crc kubenswrapper[4698]: E0224 10:31:16.128949 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8" containerName="collect-profiles" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.128972 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8" containerName="collect-profiles" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.129169 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ae4aed-b05e-41bb-9ff6-16d6eb3c78b8" containerName="collect-profiles" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.130121 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vjgsb/must-gather-m5gz6" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.134182 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vjgsb"/"openshift-service-ca.crt" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.134983 4698 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vjgsb"/"kube-root-ca.crt" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.151638 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vjgsb/must-gather-m5gz6"] Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.242933 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e026ae3f-b95b-4b97-ac97-04c34a90bcca-must-gather-output\") pod \"must-gather-m5gz6\" (UID: \"e026ae3f-b95b-4b97-ac97-04c34a90bcca\") " pod="openshift-must-gather-vjgsb/must-gather-m5gz6" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.243056 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkvqm\" 
(UniqueName: \"kubernetes.io/projected/e026ae3f-b95b-4b97-ac97-04c34a90bcca-kube-api-access-zkvqm\") pod \"must-gather-m5gz6\" (UID: \"e026ae3f-b95b-4b97-ac97-04c34a90bcca\") " pod="openshift-must-gather-vjgsb/must-gather-m5gz6" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.344411 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e026ae3f-b95b-4b97-ac97-04c34a90bcca-must-gather-output\") pod \"must-gather-m5gz6\" (UID: \"e026ae3f-b95b-4b97-ac97-04c34a90bcca\") " pod="openshift-must-gather-vjgsb/must-gather-m5gz6" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.344530 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkvqm\" (UniqueName: \"kubernetes.io/projected/e026ae3f-b95b-4b97-ac97-04c34a90bcca-kube-api-access-zkvqm\") pod \"must-gather-m5gz6\" (UID: \"e026ae3f-b95b-4b97-ac97-04c34a90bcca\") " pod="openshift-must-gather-vjgsb/must-gather-m5gz6" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.344934 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e026ae3f-b95b-4b97-ac97-04c34a90bcca-must-gather-output\") pod \"must-gather-m5gz6\" (UID: \"e026ae3f-b95b-4b97-ac97-04c34a90bcca\") " pod="openshift-must-gather-vjgsb/must-gather-m5gz6" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.368702 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkvqm\" (UniqueName: \"kubernetes.io/projected/e026ae3f-b95b-4b97-ac97-04c34a90bcca-kube-api-access-zkvqm\") pod \"must-gather-m5gz6\" (UID: \"e026ae3f-b95b-4b97-ac97-04c34a90bcca\") " pod="openshift-must-gather-vjgsb/must-gather-m5gz6" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.457573 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vjgsb/must-gather-m5gz6" Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.691980 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vjgsb/must-gather-m5gz6"] Feb 24 10:31:16 crc kubenswrapper[4698]: I0224 10:31:16.977476 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vjgsb/must-gather-m5gz6" event={"ID":"e026ae3f-b95b-4b97-ac97-04c34a90bcca","Type":"ContainerStarted","Data":"8aced5f31ffe0beac7e356954930839e67855d266f2b951c7369d9565e2d4cc5"} Feb 24 10:31:26 crc kubenswrapper[4698]: I0224 10:31:26.038821 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vjgsb/must-gather-m5gz6" event={"ID":"e026ae3f-b95b-4b97-ac97-04c34a90bcca","Type":"ContainerStarted","Data":"9ea1992c072c411564c741b79f164eb5fd51a77cb7dbb039365489b4dd81d475"} Feb 24 10:31:26 crc kubenswrapper[4698]: I0224 10:31:26.039480 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vjgsb/must-gather-m5gz6" event={"ID":"e026ae3f-b95b-4b97-ac97-04c34a90bcca","Type":"ContainerStarted","Data":"302f9c37d4de708e7856de8e63b315d70a13afdbff2554c082db09e34396067d"} Feb 24 10:31:26 crc kubenswrapper[4698]: I0224 10:31:26.068300 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vjgsb/must-gather-m5gz6" podStartSLOduration=1.934163811 podStartE2EDuration="10.06828472s" podCreationTimestamp="2026-02-24 10:31:16 +0000 UTC" firstStartedPulling="2026-02-24 10:31:16.70472576 +0000 UTC m=+901.818340011" lastFinishedPulling="2026-02-24 10:31:24.838846679 +0000 UTC m=+909.952460920" observedRunningTime="2026-02-24 10:31:26.065980685 +0000 UTC m=+911.179594916" watchObservedRunningTime="2026-02-24 10:31:26.06828472 +0000 UTC m=+911.181898961" Feb 24 10:31:26 crc kubenswrapper[4698]: I0224 10:31:26.120029 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47586: no serving certificate 
available for the kubelet" Feb 24 10:31:39 crc kubenswrapper[4698]: I0224 10:31:39.675791 4698 ???:1] "http: TLS handshake error from 192.168.126.11:43822: no serving certificate available for the kubelet" Feb 24 10:32:04 crc kubenswrapper[4698]: I0224 10:32:04.122766 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45830: no serving certificate available for the kubelet" Feb 24 10:32:04 crc kubenswrapper[4698]: I0224 10:32:04.209206 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45846: no serving certificate available for the kubelet" Feb 24 10:32:04 crc kubenswrapper[4698]: I0224 10:32:04.215162 4698 ???:1] "http: TLS handshake error from 192.168.126.11:45852: no serving certificate available for the kubelet" Feb 24 10:32:15 crc kubenswrapper[4698]: I0224 10:32:15.987064 4698 ???:1] "http: TLS handshake error from 192.168.126.11:41754: no serving certificate available for the kubelet" Feb 24 10:32:16 crc kubenswrapper[4698]: I0224 10:32:16.115597 4698 ???:1] "http: TLS handshake error from 192.168.126.11:41762: no serving certificate available for the kubelet" Feb 24 10:32:16 crc kubenswrapper[4698]: I0224 10:32:16.154544 4698 ???:1] "http: TLS handshake error from 192.168.126.11:41770: no serving certificate available for the kubelet" Feb 24 10:32:22 crc kubenswrapper[4698]: I0224 10:32:22.196389 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:32:22 crc kubenswrapper[4698]: I0224 10:32:22.196747 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Feb 24 10:32:39 crc kubenswrapper[4698]: I0224 10:32:39.839867 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5wnls"] Feb 24 10:32:39 crc kubenswrapper[4698]: I0224 10:32:39.843125 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:39 crc kubenswrapper[4698]: I0224 10:32:39.855475 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wnls"] Feb 24 10:32:39 crc kubenswrapper[4698]: I0224 10:32:39.987337 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-catalog-content\") pod \"redhat-marketplace-5wnls\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:39 crc kubenswrapper[4698]: I0224 10:32:39.987393 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-utilities\") pod \"redhat-marketplace-5wnls\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:39 crc kubenswrapper[4698]: I0224 10:32:39.987440 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqfnw\" (UniqueName: \"kubernetes.io/projected/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-kube-api-access-bqfnw\") pod \"redhat-marketplace-5wnls\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:40 crc kubenswrapper[4698]: I0224 10:32:40.088914 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqfnw\" (UniqueName: 
\"kubernetes.io/projected/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-kube-api-access-bqfnw\") pod \"redhat-marketplace-5wnls\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:40 crc kubenswrapper[4698]: I0224 10:32:40.089024 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-catalog-content\") pod \"redhat-marketplace-5wnls\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:40 crc kubenswrapper[4698]: I0224 10:32:40.089047 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-utilities\") pod \"redhat-marketplace-5wnls\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:40 crc kubenswrapper[4698]: I0224 10:32:40.089509 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-utilities\") pod \"redhat-marketplace-5wnls\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:40 crc kubenswrapper[4698]: I0224 10:32:40.089623 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-catalog-content\") pod \"redhat-marketplace-5wnls\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:40 crc kubenswrapper[4698]: I0224 10:32:40.121975 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqfnw\" (UniqueName: 
\"kubernetes.io/projected/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-kube-api-access-bqfnw\") pod \"redhat-marketplace-5wnls\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:40 crc kubenswrapper[4698]: I0224 10:32:40.164408 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:40 crc kubenswrapper[4698]: I0224 10:32:40.382253 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wnls"] Feb 24 10:32:40 crc kubenswrapper[4698]: W0224 10:32:40.388406 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71a78aac_bc43_4f51_9aff_a9a6bbe1c499.slice/crio-c140d4fd4c2aec0f9221457716bd25dc1fff9ac72580586c6592b94f4d11c485 WatchSource:0}: Error finding container c140d4fd4c2aec0f9221457716bd25dc1fff9ac72580586c6592b94f4d11c485: Status 404 returned error can't find the container with id c140d4fd4c2aec0f9221457716bd25dc1fff9ac72580586c6592b94f4d11c485 Feb 24 10:32:40 crc kubenswrapper[4698]: I0224 10:32:40.450913 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wnls" event={"ID":"71a78aac-bc43-4f51-9aff-a9a6bbe1c499","Type":"ContainerStarted","Data":"c140d4fd4c2aec0f9221457716bd25dc1fff9ac72580586c6592b94f4d11c485"} Feb 24 10:32:41 crc kubenswrapper[4698]: I0224 10:32:41.460842 4698 generic.go:334] "Generic (PLEG): container finished" podID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" containerID="d047c171907e71d4c1e91f760154a660566520e3894bf47ab5e0836c4005d49f" exitCode=0 Feb 24 10:32:41 crc kubenswrapper[4698]: I0224 10:32:41.460977 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wnls" 
event={"ID":"71a78aac-bc43-4f51-9aff-a9a6bbe1c499","Type":"ContainerDied","Data":"d047c171907e71d4c1e91f760154a660566520e3894bf47ab5e0836c4005d49f"} Feb 24 10:32:41 crc kubenswrapper[4698]: I0224 10:32:41.464500 4698 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 24 10:32:43 crc kubenswrapper[4698]: I0224 10:32:43.477090 4698 generic.go:334] "Generic (PLEG): container finished" podID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" containerID="2f2800082ffaa8a78ed7a143486f07a5332e6499dfdf2b1fec45af563869678b" exitCode=0 Feb 24 10:32:43 crc kubenswrapper[4698]: I0224 10:32:43.477246 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wnls" event={"ID":"71a78aac-bc43-4f51-9aff-a9a6bbe1c499","Type":"ContainerDied","Data":"2f2800082ffaa8a78ed7a143486f07a5332e6499dfdf2b1fec45af563869678b"} Feb 24 10:32:43 crc kubenswrapper[4698]: I0224 10:32:43.577473 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47166: no serving certificate available for the kubelet" Feb 24 10:32:43 crc kubenswrapper[4698]: I0224 10:32:43.724028 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47168: no serving certificate available for the kubelet" Feb 24 10:32:43 crc kubenswrapper[4698]: I0224 10:32:43.747969 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47172: no serving certificate available for the kubelet" Feb 24 10:32:43 crc kubenswrapper[4698]: I0224 10:32:43.750251 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47180: no serving certificate available for the kubelet" Feb 24 10:32:43 crc kubenswrapper[4698]: I0224 10:32:43.913227 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47194: no serving certificate available for the kubelet" Feb 24 10:32:43 crc kubenswrapper[4698]: I0224 10:32:43.915429 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47206: no serving certificate available for the kubelet" Feb 24 10:32:43 crc 
kubenswrapper[4698]: I0224 10:32:43.947768 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47214: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.076066 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47220: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.243251 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47234: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.280771 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47246: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.288983 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47254: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.447793 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47266: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.479015 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47270: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.484940 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wnls" event={"ID":"71a78aac-bc43-4f51-9aff-a9a6bbe1c499","Type":"ContainerStarted","Data":"4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e"} Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.510394 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5wnls" podStartSLOduration=3.125664665 podStartE2EDuration="5.510378971s" podCreationTimestamp="2026-02-24 10:32:39 +0000 UTC" firstStartedPulling="2026-02-24 10:32:41.463946136 +0000 UTC m=+986.577560417" lastFinishedPulling="2026-02-24 10:32:43.848660482 
+0000 UTC m=+988.962274723" observedRunningTime="2026-02-24 10:32:44.505626826 +0000 UTC m=+989.619241067" watchObservedRunningTime="2026-02-24 10:32:44.510378971 +0000 UTC m=+989.623993212" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.528275 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47278: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.623912 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47290: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.660609 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47304: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.827806 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47306: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.884757 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47318: no serving certificate available for the kubelet" Feb 24 10:32:44 crc kubenswrapper[4698]: I0224 10:32:44.888582 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47326: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.033277 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47330: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.042391 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47344: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.083317 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47358: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.242848 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47364: no serving certificate available for the kubelet" Feb 24 10:32:45 
crc kubenswrapper[4698]: I0224 10:32:45.396555 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47380: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.427721 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47394: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.433389 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47400: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.577350 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47414: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.622066 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47416: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.624936 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47420: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.759438 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47430: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.884574 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47442: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.888240 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47456: no serving certificate available for the kubelet" Feb 24 10:32:45 crc kubenswrapper[4698]: I0224 10:32:45.900846 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47460: no serving certificate available for the kubelet" Feb 24 10:32:46 crc kubenswrapper[4698]: I0224 10:32:46.050635 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47474: no serving certificate available for the kubelet" Feb 24 10:32:46 crc kubenswrapper[4698]: I0224 
10:32:46.083090 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47480: no serving certificate available for the kubelet" Feb 24 10:32:46 crc kubenswrapper[4698]: I0224 10:32:46.087073 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47482: no serving certificate available for the kubelet" Feb 24 10:32:50 crc kubenswrapper[4698]: I0224 10:32:50.164627 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:50 crc kubenswrapper[4698]: I0224 10:32:50.165007 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:50 crc kubenswrapper[4698]: I0224 10:32:50.238681 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:50 crc kubenswrapper[4698]: I0224 10:32:50.582225 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:50 crc kubenswrapper[4698]: I0224 10:32:50.653609 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wnls"] Feb 24 10:32:52 crc kubenswrapper[4698]: I0224 10:32:52.196494 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:32:52 crc kubenswrapper[4698]: I0224 10:32:52.196579 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 
10:32:52 crc kubenswrapper[4698]: I0224 10:32:52.532321 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5wnls" podUID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" containerName="registry-server" containerID="cri-o://4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e" gracePeriod=2 Feb 24 10:32:52 crc kubenswrapper[4698]: I0224 10:32:52.899239 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:52 crc kubenswrapper[4698]: I0224 10:32:52.950898 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqfnw\" (UniqueName: \"kubernetes.io/projected/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-kube-api-access-bqfnw\") pod \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " Feb 24 10:32:52 crc kubenswrapper[4698]: I0224 10:32:52.950965 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-utilities\") pod \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " Feb 24 10:32:52 crc kubenswrapper[4698]: I0224 10:32:52.950992 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-catalog-content\") pod \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\" (UID: \"71a78aac-bc43-4f51-9aff-a9a6bbe1c499\") " Feb 24 10:32:52 crc kubenswrapper[4698]: I0224 10:32:52.952786 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-utilities" (OuterVolumeSpecName: "utilities") pod "71a78aac-bc43-4f51-9aff-a9a6bbe1c499" (UID: "71a78aac-bc43-4f51-9aff-a9a6bbe1c499"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:32:52 crc kubenswrapper[4698]: I0224 10:32:52.956091 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-kube-api-access-bqfnw" (OuterVolumeSpecName: "kube-api-access-bqfnw") pod "71a78aac-bc43-4f51-9aff-a9a6bbe1c499" (UID: "71a78aac-bc43-4f51-9aff-a9a6bbe1c499"). InnerVolumeSpecName "kube-api-access-bqfnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:32:52 crc kubenswrapper[4698]: I0224 10:32:52.979204 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71a78aac-bc43-4f51-9aff-a9a6bbe1c499" (UID: "71a78aac-bc43-4f51-9aff-a9a6bbe1c499"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.051904 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.051948 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.051962 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqfnw\" (UniqueName: \"kubernetes.io/projected/71a78aac-bc43-4f51-9aff-a9a6bbe1c499-kube-api-access-bqfnw\") on node \"crc\" DevicePath \"\"" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.541558 4698 generic.go:334] "Generic (PLEG): container finished" podID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" 
containerID="4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e" exitCode=0 Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.541633 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wnls" event={"ID":"71a78aac-bc43-4f51-9aff-a9a6bbe1c499","Type":"ContainerDied","Data":"4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e"} Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.541656 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wnls" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.542585 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wnls" event={"ID":"71a78aac-bc43-4f51-9aff-a9a6bbe1c499","Type":"ContainerDied","Data":"c140d4fd4c2aec0f9221457716bd25dc1fff9ac72580586c6592b94f4d11c485"} Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.542702 4698 scope.go:117] "RemoveContainer" containerID="4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.575379 4698 scope.go:117] "RemoveContainer" containerID="2f2800082ffaa8a78ed7a143486f07a5332e6499dfdf2b1fec45af563869678b" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.582604 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wnls"] Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.587422 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wnls"] Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.610444 4698 scope.go:117] "RemoveContainer" containerID="d047c171907e71d4c1e91f760154a660566520e3894bf47ab5e0836c4005d49f" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.634231 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" 
path="/var/lib/kubelet/pods/71a78aac-bc43-4f51-9aff-a9a6bbe1c499/volumes" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.636707 4698 scope.go:117] "RemoveContainer" containerID="4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e" Feb 24 10:32:53 crc kubenswrapper[4698]: E0224 10:32:53.637060 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e\": container with ID starting with 4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e not found: ID does not exist" containerID="4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.637099 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e"} err="failed to get container status \"4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e\": rpc error: code = NotFound desc = could not find container \"4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e\": container with ID starting with 4786a7c90f834e6fa4eeb712031c14e3caaf2337417b976d75929026a7774e0e not found: ID does not exist" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.637130 4698 scope.go:117] "RemoveContainer" containerID="2f2800082ffaa8a78ed7a143486f07a5332e6499dfdf2b1fec45af563869678b" Feb 24 10:32:53 crc kubenswrapper[4698]: E0224 10:32:53.637530 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f2800082ffaa8a78ed7a143486f07a5332e6499dfdf2b1fec45af563869678b\": container with ID starting with 2f2800082ffaa8a78ed7a143486f07a5332e6499dfdf2b1fec45af563869678b not found: ID does not exist" containerID="2f2800082ffaa8a78ed7a143486f07a5332e6499dfdf2b1fec45af563869678b" Feb 24 10:32:53 crc kubenswrapper[4698]: 
I0224 10:32:53.637598 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f2800082ffaa8a78ed7a143486f07a5332e6499dfdf2b1fec45af563869678b"} err="failed to get container status \"2f2800082ffaa8a78ed7a143486f07a5332e6499dfdf2b1fec45af563869678b\": rpc error: code = NotFound desc = could not find container \"2f2800082ffaa8a78ed7a143486f07a5332e6499dfdf2b1fec45af563869678b\": container with ID starting with 2f2800082ffaa8a78ed7a143486f07a5332e6499dfdf2b1fec45af563869678b not found: ID does not exist" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.637642 4698 scope.go:117] "RemoveContainer" containerID="d047c171907e71d4c1e91f760154a660566520e3894bf47ab5e0836c4005d49f" Feb 24 10:32:53 crc kubenswrapper[4698]: E0224 10:32:53.638078 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d047c171907e71d4c1e91f760154a660566520e3894bf47ab5e0836c4005d49f\": container with ID starting with d047c171907e71d4c1e91f760154a660566520e3894bf47ab5e0836c4005d49f not found: ID does not exist" containerID="d047c171907e71d4c1e91f760154a660566520e3894bf47ab5e0836c4005d49f" Feb 24 10:32:53 crc kubenswrapper[4698]: I0224 10:32:53.638105 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d047c171907e71d4c1e91f760154a660566520e3894bf47ab5e0836c4005d49f"} err="failed to get container status \"d047c171907e71d4c1e91f760154a660566520e3894bf47ab5e0836c4005d49f\": rpc error: code = NotFound desc = could not find container \"d047c171907e71d4c1e91f760154a660566520e3894bf47ab5e0836c4005d49f\": container with ID starting with d047c171907e71d4c1e91f760154a660566520e3894bf47ab5e0836c4005d49f not found: ID does not exist" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.314436 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-56ph9"] Feb 24 10:32:59 crc kubenswrapper[4698]: 
E0224 10:32:59.315249 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" containerName="registry-server" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.315294 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" containerName="registry-server" Feb 24 10:32:59 crc kubenswrapper[4698]: E0224 10:32:59.315310 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" containerName="extract-utilities" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.315317 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" containerName="extract-utilities" Feb 24 10:32:59 crc kubenswrapper[4698]: E0224 10:32:59.315331 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" containerName="extract-content" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.315337 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" containerName="extract-content" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.315460 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a78aac-bc43-4f51-9aff-a9a6bbe1c499" containerName="registry-server" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.316321 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.324331 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56ph9"] Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.433347 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g62n2\" (UniqueName: \"kubernetes.io/projected/6ea729e0-450e-4289-bd33-658074c05e54-kube-api-access-g62n2\") pod \"community-operators-56ph9\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.433411 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-catalog-content\") pod \"community-operators-56ph9\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.433448 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-utilities\") pod \"community-operators-56ph9\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.534376 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-utilities\") pod \"community-operators-56ph9\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.534449 4698 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-g62n2\" (UniqueName: \"kubernetes.io/projected/6ea729e0-450e-4289-bd33-658074c05e54-kube-api-access-g62n2\") pod \"community-operators-56ph9\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.534490 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-catalog-content\") pod \"community-operators-56ph9\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.534917 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-catalog-content\") pod \"community-operators-56ph9\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.535137 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-utilities\") pod \"community-operators-56ph9\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.567516 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g62n2\" (UniqueName: \"kubernetes.io/projected/6ea729e0-450e-4289-bd33-658074c05e54-kube-api-access-g62n2\") pod \"community-operators-56ph9\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.636473 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:32:59 crc kubenswrapper[4698]: I0224 10:32:59.877197 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56ph9"] Feb 24 10:33:00 crc kubenswrapper[4698]: I0224 10:33:00.582417 4698 generic.go:334] "Generic (PLEG): container finished" podID="6ea729e0-450e-4289-bd33-658074c05e54" containerID="4293a7acdc7f39e84cfdcd5dbe509144fd76508b91f4326d91ed028a0518ff22" exitCode=0 Feb 24 10:33:00 crc kubenswrapper[4698]: I0224 10:33:00.582500 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56ph9" event={"ID":"6ea729e0-450e-4289-bd33-658074c05e54","Type":"ContainerDied","Data":"4293a7acdc7f39e84cfdcd5dbe509144fd76508b91f4326d91ed028a0518ff22"} Feb 24 10:33:00 crc kubenswrapper[4698]: I0224 10:33:00.582728 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56ph9" event={"ID":"6ea729e0-450e-4289-bd33-658074c05e54","Type":"ContainerStarted","Data":"657d0d4aab3ed58dd591d7097fda6d976a27ae08749de83c086fb97f770bf716"} Feb 24 10:33:01 crc kubenswrapper[4698]: I0224 10:33:01.590183 4698 generic.go:334] "Generic (PLEG): container finished" podID="6ea729e0-450e-4289-bd33-658074c05e54" containerID="0f46fc1bc296480119fe9651cc1109579a28d16a1c385c6d6a420da35450d1f9" exitCode=0 Feb 24 10:33:01 crc kubenswrapper[4698]: I0224 10:33:01.590292 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56ph9" event={"ID":"6ea729e0-450e-4289-bd33-658074c05e54","Type":"ContainerDied","Data":"0f46fc1bc296480119fe9651cc1109579a28d16a1c385c6d6a420da35450d1f9"} Feb 24 10:33:02 crc kubenswrapper[4698]: I0224 10:33:02.598877 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56ph9" 
event={"ID":"6ea729e0-450e-4289-bd33-658074c05e54","Type":"ContainerStarted","Data":"83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb"} Feb 24 10:33:02 crc kubenswrapper[4698]: I0224 10:33:02.614906 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-56ph9" podStartSLOduration=2.123309433 podStartE2EDuration="3.614887243s" podCreationTimestamp="2026-02-24 10:32:59 +0000 UTC" firstStartedPulling="2026-02-24 10:33:00.584001465 +0000 UTC m=+1005.697615706" lastFinishedPulling="2026-02-24 10:33:02.075579265 +0000 UTC m=+1007.189193516" observedRunningTime="2026-02-24 10:33:02.613624853 +0000 UTC m=+1007.727239174" watchObservedRunningTime="2026-02-24 10:33:02.614887243 +0000 UTC m=+1007.728501484" Feb 24 10:33:09 crc kubenswrapper[4698]: I0224 10:33:09.636755 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:33:09 crc kubenswrapper[4698]: I0224 10:33:09.637494 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:33:09 crc kubenswrapper[4698]: I0224 10:33:09.711465 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:33:09 crc kubenswrapper[4698]: I0224 10:33:09.778501 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:33:09 crc kubenswrapper[4698]: I0224 10:33:09.948495 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-56ph9"] Feb 24 10:33:11 crc kubenswrapper[4698]: I0224 10:33:11.657808 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-56ph9" podUID="6ea729e0-450e-4289-bd33-658074c05e54" containerName="registry-server" 
containerID="cri-o://83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb" gracePeriod=2 Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.098723 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.157363 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-utilities\") pod \"6ea729e0-450e-4289-bd33-658074c05e54\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.157431 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-catalog-content\") pod \"6ea729e0-450e-4289-bd33-658074c05e54\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.157525 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g62n2\" (UniqueName: \"kubernetes.io/projected/6ea729e0-450e-4289-bd33-658074c05e54-kube-api-access-g62n2\") pod \"6ea729e0-450e-4289-bd33-658074c05e54\" (UID: \"6ea729e0-450e-4289-bd33-658074c05e54\") " Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.159530 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-utilities" (OuterVolumeSpecName: "utilities") pod "6ea729e0-450e-4289-bd33-658074c05e54" (UID: "6ea729e0-450e-4289-bd33-658074c05e54"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.170571 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea729e0-450e-4289-bd33-658074c05e54-kube-api-access-g62n2" (OuterVolumeSpecName: "kube-api-access-g62n2") pod "6ea729e0-450e-4289-bd33-658074c05e54" (UID: "6ea729e0-450e-4289-bd33-658074c05e54"). InnerVolumeSpecName "kube-api-access-g62n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.259125 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.259174 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g62n2\" (UniqueName: \"kubernetes.io/projected/6ea729e0-450e-4289-bd33-658074c05e54-kube-api-access-g62n2\") on node \"crc\" DevicePath \"\"" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.605656 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ea729e0-450e-4289-bd33-658074c05e54" (UID: "6ea729e0-450e-4289-bd33-658074c05e54"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.665545 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ea729e0-450e-4289-bd33-658074c05e54-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.670299 4698 generic.go:334] "Generic (PLEG): container finished" podID="6ea729e0-450e-4289-bd33-658074c05e54" containerID="83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb" exitCode=0 Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.670352 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56ph9" event={"ID":"6ea729e0-450e-4289-bd33-658074c05e54","Type":"ContainerDied","Data":"83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb"} Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.670383 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56ph9" event={"ID":"6ea729e0-450e-4289-bd33-658074c05e54","Type":"ContainerDied","Data":"657d0d4aab3ed58dd591d7097fda6d976a27ae08749de83c086fb97f770bf716"} Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.670403 4698 scope.go:117] "RemoveContainer" containerID="83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.670544 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56ph9" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.706633 4698 scope.go:117] "RemoveContainer" containerID="0f46fc1bc296480119fe9651cc1109579a28d16a1c385c6d6a420da35450d1f9" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.727466 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-56ph9"] Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.732086 4698 scope.go:117] "RemoveContainer" containerID="4293a7acdc7f39e84cfdcd5dbe509144fd76508b91f4326d91ed028a0518ff22" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.733103 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-56ph9"] Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.752359 4698 scope.go:117] "RemoveContainer" containerID="83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb" Feb 24 10:33:12 crc kubenswrapper[4698]: E0224 10:33:12.752809 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb\": container with ID starting with 83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb not found: ID does not exist" containerID="83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.752871 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb"} err="failed to get container status \"83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb\": rpc error: code = NotFound desc = could not find container \"83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb\": container with ID starting with 83383bc8fd8dacc478d10c8c9e9db12613d8db8ee3392a307f7b2fec053419eb not 
found: ID does not exist" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.752894 4698 scope.go:117] "RemoveContainer" containerID="0f46fc1bc296480119fe9651cc1109579a28d16a1c385c6d6a420da35450d1f9" Feb 24 10:33:12 crc kubenswrapper[4698]: E0224 10:33:12.753190 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f46fc1bc296480119fe9651cc1109579a28d16a1c385c6d6a420da35450d1f9\": container with ID starting with 0f46fc1bc296480119fe9651cc1109579a28d16a1c385c6d6a420da35450d1f9 not found: ID does not exist" containerID="0f46fc1bc296480119fe9651cc1109579a28d16a1c385c6d6a420da35450d1f9" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.753222 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f46fc1bc296480119fe9651cc1109579a28d16a1c385c6d6a420da35450d1f9"} err="failed to get container status \"0f46fc1bc296480119fe9651cc1109579a28d16a1c385c6d6a420da35450d1f9\": rpc error: code = NotFound desc = could not find container \"0f46fc1bc296480119fe9651cc1109579a28d16a1c385c6d6a420da35450d1f9\": container with ID starting with 0f46fc1bc296480119fe9651cc1109579a28d16a1c385c6d6a420da35450d1f9 not found: ID does not exist" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.753238 4698 scope.go:117] "RemoveContainer" containerID="4293a7acdc7f39e84cfdcd5dbe509144fd76508b91f4326d91ed028a0518ff22" Feb 24 10:33:12 crc kubenswrapper[4698]: E0224 10:33:12.753648 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4293a7acdc7f39e84cfdcd5dbe509144fd76508b91f4326d91ed028a0518ff22\": container with ID starting with 4293a7acdc7f39e84cfdcd5dbe509144fd76508b91f4326d91ed028a0518ff22 not found: ID does not exist" containerID="4293a7acdc7f39e84cfdcd5dbe509144fd76508b91f4326d91ed028a0518ff22" Feb 24 10:33:12 crc kubenswrapper[4698]: I0224 10:33:12.753721 4698 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4293a7acdc7f39e84cfdcd5dbe509144fd76508b91f4326d91ed028a0518ff22"} err="failed to get container status \"4293a7acdc7f39e84cfdcd5dbe509144fd76508b91f4326d91ed028a0518ff22\": rpc error: code = NotFound desc = could not find container \"4293a7acdc7f39e84cfdcd5dbe509144fd76508b91f4326d91ed028a0518ff22\": container with ID starting with 4293a7acdc7f39e84cfdcd5dbe509144fd76508b91f4326d91ed028a0518ff22 not found: ID does not exist" Feb 24 10:33:13 crc kubenswrapper[4698]: I0224 10:33:13.623081 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea729e0-450e-4289-bd33-658074c05e54" path="/var/lib/kubelet/pods/6ea729e0-450e-4289-bd33-658074c05e54/volumes" Feb 24 10:33:22 crc kubenswrapper[4698]: I0224 10:33:22.196775 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:33:22 crc kubenswrapper[4698]: I0224 10:33:22.197402 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 24 10:33:22 crc kubenswrapper[4698]: I0224 10:33:22.197464 4698 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-nn578" Feb 24 10:33:22 crc kubenswrapper[4698]: I0224 10:33:22.198724 4698 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"5b500e3b8410dff824193492887fc096fc35e4773517369eceee59151bac59ea"} pod="openshift-machine-config-operator/machine-config-daemon-nn578" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 24 10:33:22 crc kubenswrapper[4698]: I0224 10:33:22.198786 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" containerID="cri-o://5b500e3b8410dff824193492887fc096fc35e4773517369eceee59151bac59ea" gracePeriod=600 Feb 24 10:33:22 crc kubenswrapper[4698]: I0224 10:33:22.737619 4698 generic.go:334] "Generic (PLEG): container finished" podID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerID="5b500e3b8410dff824193492887fc096fc35e4773517369eceee59151bac59ea" exitCode=0 Feb 24 10:33:22 crc kubenswrapper[4698]: I0224 10:33:22.737817 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerDied","Data":"5b500e3b8410dff824193492887fc096fc35e4773517369eceee59151bac59ea"} Feb 24 10:33:22 crc kubenswrapper[4698]: I0224 10:33:22.737846 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-nn578" event={"ID":"b4ee0bb1-125d-4852-a54d-7dadf6177545","Type":"ContainerStarted","Data":"be499a825046fbb093a7aa6c217bad41975f72830038647cf32f32b9a45b4fd7"} Feb 24 10:33:22 crc kubenswrapper[4698]: I0224 10:33:22.737865 4698 scope.go:117] "RemoveContainer" containerID="829b9213c4c673d3133873424826a5ea12ee4cbf361962bd4c39f0f65c6f48c4" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.119386 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-77rkl"] Feb 24 10:33:24 crc kubenswrapper[4698]: E0224 
10:33:24.119988 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea729e0-450e-4289-bd33-658074c05e54" containerName="extract-utilities" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.120008 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea729e0-450e-4289-bd33-658074c05e54" containerName="extract-utilities" Feb 24 10:33:24 crc kubenswrapper[4698]: E0224 10:33:24.120022 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea729e0-450e-4289-bd33-658074c05e54" containerName="extract-content" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.120035 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea729e0-450e-4289-bd33-658074c05e54" containerName="extract-content" Feb 24 10:33:24 crc kubenswrapper[4698]: E0224 10:33:24.120067 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea729e0-450e-4289-bd33-658074c05e54" containerName="registry-server" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.120080 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea729e0-450e-4289-bd33-658074c05e54" containerName="registry-server" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.120245 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea729e0-450e-4289-bd33-658074c05e54" containerName="registry-server" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.121478 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.141762 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77rkl"] Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.212197 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmn6z\" (UniqueName: \"kubernetes.io/projected/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-kube-api-access-wmn6z\") pod \"certified-operators-77rkl\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.212551 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-utilities\") pod \"certified-operators-77rkl\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.212760 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-catalog-content\") pod \"certified-operators-77rkl\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.314609 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmn6z\" (UniqueName: \"kubernetes.io/projected/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-kube-api-access-wmn6z\") pod \"certified-operators-77rkl\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.314659 4698 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-utilities\") pod \"certified-operators-77rkl\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.314693 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-catalog-content\") pod \"certified-operators-77rkl\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.315286 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-catalog-content\") pod \"certified-operators-77rkl\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.315632 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-utilities\") pod \"certified-operators-77rkl\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.354786 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmn6z\" (UniqueName: \"kubernetes.io/projected/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-kube-api-access-wmn6z\") pod \"certified-operators-77rkl\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.449376 4698 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.710531 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-77rkl"] Feb 24 10:33:24 crc kubenswrapper[4698]: W0224 10:33:24.715103 4698 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85b2a18_b2e2_46e0_b3de_6bb323e9300e.slice/crio-30a06e87985debe48ac119d800f0e2595c80bdda56f332c05896b106b5079a46 WatchSource:0}: Error finding container 30a06e87985debe48ac119d800f0e2595c80bdda56f332c05896b106b5079a46: Status 404 returned error can't find the container with id 30a06e87985debe48ac119d800f0e2595c80bdda56f332c05896b106b5079a46 Feb 24 10:33:24 crc kubenswrapper[4698]: I0224 10:33:24.753723 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77rkl" event={"ID":"b85b2a18-b2e2-46e0-b3de-6bb323e9300e","Type":"ContainerStarted","Data":"30a06e87985debe48ac119d800f0e2595c80bdda56f332c05896b106b5079a46"} Feb 24 10:33:25 crc kubenswrapper[4698]: I0224 10:33:25.762436 4698 generic.go:334] "Generic (PLEG): container finished" podID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" containerID="9fa58aa2461cc7225b03c8f8dd847b4529671fd1dac1380a74598608b15b4d9c" exitCode=0 Feb 24 10:33:25 crc kubenswrapper[4698]: I0224 10:33:25.762499 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77rkl" event={"ID":"b85b2a18-b2e2-46e0-b3de-6bb323e9300e","Type":"ContainerDied","Data":"9fa58aa2461cc7225b03c8f8dd847b4529671fd1dac1380a74598608b15b4d9c"} Feb 24 10:33:26 crc kubenswrapper[4698]: I0224 10:33:26.774055 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77rkl" 
event={"ID":"b85b2a18-b2e2-46e0-b3de-6bb323e9300e","Type":"ContainerStarted","Data":"a0765bdbbf1f1cecb17cf291c783a35f21bb7c0723913304cf80f81cc87ffd28"} Feb 24 10:33:27 crc kubenswrapper[4698]: I0224 10:33:27.783248 4698 generic.go:334] "Generic (PLEG): container finished" podID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" containerID="a0765bdbbf1f1cecb17cf291c783a35f21bb7c0723913304cf80f81cc87ffd28" exitCode=0 Feb 24 10:33:27 crc kubenswrapper[4698]: I0224 10:33:27.783379 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77rkl" event={"ID":"b85b2a18-b2e2-46e0-b3de-6bb323e9300e","Type":"ContainerDied","Data":"a0765bdbbf1f1cecb17cf291c783a35f21bb7c0723913304cf80f81cc87ffd28"} Feb 24 10:33:28 crc kubenswrapper[4698]: I0224 10:33:28.804713 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77rkl" event={"ID":"b85b2a18-b2e2-46e0-b3de-6bb323e9300e","Type":"ContainerStarted","Data":"b89ae5e68a23cb05a84d2534788575e8674c9f73bb60618c8afc12b74df57b69"} Feb 24 10:33:28 crc kubenswrapper[4698]: I0224 10:33:28.831587 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-77rkl" podStartSLOduration=2.397828916 podStartE2EDuration="4.831572063s" podCreationTimestamp="2026-02-24 10:33:24 +0000 UTC" firstStartedPulling="2026-02-24 10:33:25.764405739 +0000 UTC m=+1030.878019990" lastFinishedPulling="2026-02-24 10:33:28.198148866 +0000 UTC m=+1033.311763137" observedRunningTime="2026-02-24 10:33:28.82729969 +0000 UTC m=+1033.940913981" watchObservedRunningTime="2026-02-24 10:33:28.831572063 +0000 UTC m=+1033.945186304" Feb 24 10:33:34 crc kubenswrapper[4698]: I0224 10:33:34.450005 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:34 crc kubenswrapper[4698]: I0224 10:33:34.452277 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:34 crc kubenswrapper[4698]: I0224 10:33:34.529661 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:34 crc kubenswrapper[4698]: I0224 10:33:34.893544 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:34 crc kubenswrapper[4698]: I0224 10:33:34.962518 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77rkl"] Feb 24 10:33:36 crc kubenswrapper[4698]: I0224 10:33:36.860738 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-77rkl" podUID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" containerName="registry-server" containerID="cri-o://b89ae5e68a23cb05a84d2534788575e8674c9f73bb60618c8afc12b74df57b69" gracePeriod=2 Feb 24 10:33:37 crc kubenswrapper[4698]: I0224 10:33:37.870981 4698 generic.go:334] "Generic (PLEG): container finished" podID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" containerID="b89ae5e68a23cb05a84d2534788575e8674c9f73bb60618c8afc12b74df57b69" exitCode=0 Feb 24 10:33:37 crc kubenswrapper[4698]: I0224 10:33:37.871227 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77rkl" event={"ID":"b85b2a18-b2e2-46e0-b3de-6bb323e9300e","Type":"ContainerDied","Data":"b89ae5e68a23cb05a84d2534788575e8674c9f73bb60618c8afc12b74df57b69"} Feb 24 10:33:37 crc kubenswrapper[4698]: I0224 10:33:37.907029 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.036131 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-catalog-content\") pod \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.036288 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-utilities\") pod \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.036346 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmn6z\" (UniqueName: \"kubernetes.io/projected/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-kube-api-access-wmn6z\") pod \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\" (UID: \"b85b2a18-b2e2-46e0-b3de-6bb323e9300e\") " Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.037053 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-utilities" (OuterVolumeSpecName: "utilities") pod "b85b2a18-b2e2-46e0-b3de-6bb323e9300e" (UID: "b85b2a18-b2e2-46e0-b3de-6bb323e9300e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.047757 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-kube-api-access-wmn6z" (OuterVolumeSpecName: "kube-api-access-wmn6z") pod "b85b2a18-b2e2-46e0-b3de-6bb323e9300e" (UID: "b85b2a18-b2e2-46e0-b3de-6bb323e9300e"). InnerVolumeSpecName "kube-api-access-wmn6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.137667 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.137702 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmn6z\" (UniqueName: \"kubernetes.io/projected/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-kube-api-access-wmn6z\") on node \"crc\" DevicePath \"\"" Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.531719 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b85b2a18-b2e2-46e0-b3de-6bb323e9300e" (UID: "b85b2a18-b2e2-46e0-b3de-6bb323e9300e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.544747 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b85b2a18-b2e2-46e0-b3de-6bb323e9300e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.883814 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-77rkl" event={"ID":"b85b2a18-b2e2-46e0-b3de-6bb323e9300e","Type":"ContainerDied","Data":"30a06e87985debe48ac119d800f0e2595c80bdda56f332c05896b106b5079a46"} Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.884129 4698 scope.go:117] "RemoveContainer" containerID="b89ae5e68a23cb05a84d2534788575e8674c9f73bb60618c8afc12b74df57b69" Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.884410 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-77rkl" Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.933150 4698 scope.go:117] "RemoveContainer" containerID="a0765bdbbf1f1cecb17cf291c783a35f21bb7c0723913304cf80f81cc87ffd28" Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.947577 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-77rkl"] Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.952103 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-77rkl"] Feb 24 10:33:38 crc kubenswrapper[4698]: I0224 10:33:38.994336 4698 scope.go:117] "RemoveContainer" containerID="9fa58aa2461cc7225b03c8f8dd847b4529671fd1dac1380a74598608b15b4d9c" Feb 24 10:33:39 crc kubenswrapper[4698]: I0224 10:33:39.633524 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" path="/var/lib/kubelet/pods/b85b2a18-b2e2-46e0-b3de-6bb323e9300e/volumes" Feb 24 10:33:51 crc kubenswrapper[4698]: I0224 10:33:51.155932 4698 generic.go:334] "Generic (PLEG): container finished" podID="e026ae3f-b95b-4b97-ac97-04c34a90bcca" containerID="302f9c37d4de708e7856de8e63b315d70a13afdbff2554c082db09e34396067d" exitCode=0 Feb 24 10:33:51 crc kubenswrapper[4698]: I0224 10:33:51.156075 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vjgsb/must-gather-m5gz6" event={"ID":"e026ae3f-b95b-4b97-ac97-04c34a90bcca","Type":"ContainerDied","Data":"302f9c37d4de708e7856de8e63b315d70a13afdbff2554c082db09e34396067d"} Feb 24 10:33:51 crc kubenswrapper[4698]: I0224 10:33:51.157176 4698 scope.go:117] "RemoveContainer" containerID="302f9c37d4de708e7856de8e63b315d70a13afdbff2554c082db09e34396067d" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.135842 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59108: no serving certificate available for the kubelet" Feb 24 10:33:56 crc 
kubenswrapper[4698]: I0224 10:33:56.318174 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59120: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.328883 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59122: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.353082 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59130: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.365909 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59146: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.381094 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59156: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.394382 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59172: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.409991 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59188: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.421967 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59196: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.569724 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59204: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.582166 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59206: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.604936 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59214: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 
10:33:56.617305 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59224: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.631102 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59236: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.643438 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59240: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.656926 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59254: no serving certificate available for the kubelet" Feb 24 10:33:56 crc kubenswrapper[4698]: I0224 10:33:56.665803 4698 ???:1] "http: TLS handshake error from 192.168.126.11:59270: no serving certificate available for the kubelet" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.497367 4698 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zlh2h"] Feb 24 10:33:58 crc kubenswrapper[4698]: E0224 10:33:58.498370 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" containerName="registry-server" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.498393 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" containerName="registry-server" Feb 24 10:33:58 crc kubenswrapper[4698]: E0224 10:33:58.498430 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" containerName="extract-utilities" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.498447 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" containerName="extract-utilities" Feb 24 10:33:58 crc kubenswrapper[4698]: E0224 10:33:58.498474 4698 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" 
containerName="extract-content" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.498491 4698 state_mem.go:107] "Deleted CPUSet assignment" podUID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" containerName="extract-content" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.498808 4698 memory_manager.go:354] "RemoveStaleState removing state" podUID="b85b2a18-b2e2-46e0-b3de-6bb323e9300e" containerName="registry-server" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.500188 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.510150 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zlh2h"] Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.629405 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-utilities\") pod \"redhat-operators-zlh2h\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.629539 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xbg8\" (UniqueName: \"kubernetes.io/projected/e367f7aa-c587-4df4-9a1c-0f3448a72faf-kube-api-access-8xbg8\") pod \"redhat-operators-zlh2h\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.629731 4698 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-catalog-content\") pod \"redhat-operators-zlh2h\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " 
pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.730375 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-utilities\") pod \"redhat-operators-zlh2h\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.730427 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xbg8\" (UniqueName: \"kubernetes.io/projected/e367f7aa-c587-4df4-9a1c-0f3448a72faf-kube-api-access-8xbg8\") pod \"redhat-operators-zlh2h\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.730478 4698 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-catalog-content\") pod \"redhat-operators-zlh2h\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.730921 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-catalog-content\") pod \"redhat-operators-zlh2h\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.731167 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-utilities\") pod \"redhat-operators-zlh2h\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:33:58 crc 
kubenswrapper[4698]: I0224 10:33:58.760794 4698 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xbg8\" (UniqueName: \"kubernetes.io/projected/e367f7aa-c587-4df4-9a1c-0f3448a72faf-kube-api-access-8xbg8\") pod \"redhat-operators-zlh2h\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:33:58 crc kubenswrapper[4698]: I0224 10:33:58.831658 4698 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:33:59 crc kubenswrapper[4698]: I0224 10:33:59.270635 4698 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zlh2h"] Feb 24 10:34:00 crc kubenswrapper[4698]: I0224 10:34:00.215363 4698 generic.go:334] "Generic (PLEG): container finished" podID="e367f7aa-c587-4df4-9a1c-0f3448a72faf" containerID="f9408b3bf28909a9fcd0402a2186ec06ee660896cb7246256fd49046bc621c33" exitCode=0 Feb 24 10:34:00 crc kubenswrapper[4698]: I0224 10:34:00.215473 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh2h" event={"ID":"e367f7aa-c587-4df4-9a1c-0f3448a72faf","Type":"ContainerDied","Data":"f9408b3bf28909a9fcd0402a2186ec06ee660896cb7246256fd49046bc621c33"} Feb 24 10:34:00 crc kubenswrapper[4698]: I0224 10:34:00.215787 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh2h" event={"ID":"e367f7aa-c587-4df4-9a1c-0f3448a72faf","Type":"ContainerStarted","Data":"140bbc5903b1eae40f8275b686a47251319c35616fff2fc434eea21cd9420f5b"} Feb 24 10:34:01 crc kubenswrapper[4698]: I0224 10:34:01.228467 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh2h" event={"ID":"e367f7aa-c587-4df4-9a1c-0f3448a72faf","Type":"ContainerStarted","Data":"7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f"} Feb 24 10:34:01 crc kubenswrapper[4698]: I0224 
10:34:01.862511 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vjgsb/must-gather-m5gz6"] Feb 24 10:34:01 crc kubenswrapper[4698]: I0224 10:34:01.862873 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vjgsb/must-gather-m5gz6" podUID="e026ae3f-b95b-4b97-ac97-04c34a90bcca" containerName="copy" containerID="cri-o://9ea1992c072c411564c741b79f164eb5fd51a77cb7dbb039365489b4dd81d475" gracePeriod=2 Feb 24 10:34:01 crc kubenswrapper[4698]: I0224 10:34:01.867044 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vjgsb/must-gather-m5gz6"] Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.239727 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vjgsb_must-gather-m5gz6_e026ae3f-b95b-4b97-ac97-04c34a90bcca/copy/0.log" Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.240822 4698 generic.go:334] "Generic (PLEG): container finished" podID="e026ae3f-b95b-4b97-ac97-04c34a90bcca" containerID="9ea1992c072c411564c741b79f164eb5fd51a77cb7dbb039365489b4dd81d475" exitCode=143 Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.240889 4698 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aced5f31ffe0beac7e356954930839e67855d266f2b951c7369d9565e2d4cc5" Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.247764 4698 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vjgsb_must-gather-m5gz6_e026ae3f-b95b-4b97-ac97-04c34a90bcca/copy/0.log" Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.248110 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vjgsb/must-gather-m5gz6" Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.249642 4698 generic.go:334] "Generic (PLEG): container finished" podID="e367f7aa-c587-4df4-9a1c-0f3448a72faf" containerID="7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f" exitCode=0 Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.249680 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh2h" event={"ID":"e367f7aa-c587-4df4-9a1c-0f3448a72faf","Type":"ContainerDied","Data":"7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f"} Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.379381 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvqm\" (UniqueName: \"kubernetes.io/projected/e026ae3f-b95b-4b97-ac97-04c34a90bcca-kube-api-access-zkvqm\") pod \"e026ae3f-b95b-4b97-ac97-04c34a90bcca\" (UID: \"e026ae3f-b95b-4b97-ac97-04c34a90bcca\") " Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.379443 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e026ae3f-b95b-4b97-ac97-04c34a90bcca-must-gather-output\") pod \"e026ae3f-b95b-4b97-ac97-04c34a90bcca\" (UID: \"e026ae3f-b95b-4b97-ac97-04c34a90bcca\") " Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.384092 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e026ae3f-b95b-4b97-ac97-04c34a90bcca-kube-api-access-zkvqm" (OuterVolumeSpecName: "kube-api-access-zkvqm") pod "e026ae3f-b95b-4b97-ac97-04c34a90bcca" (UID: "e026ae3f-b95b-4b97-ac97-04c34a90bcca"). InnerVolumeSpecName "kube-api-access-zkvqm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.434056 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e026ae3f-b95b-4b97-ac97-04c34a90bcca-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e026ae3f-b95b-4b97-ac97-04c34a90bcca" (UID: "e026ae3f-b95b-4b97-ac97-04c34a90bcca"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.480666 4698 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e026ae3f-b95b-4b97-ac97-04c34a90bcca-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 24 10:34:02 crc kubenswrapper[4698]: I0224 10:34:02.480947 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvqm\" (UniqueName: \"kubernetes.io/projected/e026ae3f-b95b-4b97-ac97-04c34a90bcca-kube-api-access-zkvqm\") on node \"crc\" DevicePath \"\"" Feb 24 10:34:03 crc kubenswrapper[4698]: I0224 10:34:03.265174 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vjgsb/must-gather-m5gz6" Feb 24 10:34:03 crc kubenswrapper[4698]: I0224 10:34:03.266009 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh2h" event={"ID":"e367f7aa-c587-4df4-9a1c-0f3448a72faf","Type":"ContainerStarted","Data":"cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e"} Feb 24 10:34:03 crc kubenswrapper[4698]: I0224 10:34:03.295941 4698 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zlh2h" podStartSLOduration=2.816986205 podStartE2EDuration="5.295923326s" podCreationTimestamp="2026-02-24 10:33:58 +0000 UTC" firstStartedPulling="2026-02-24 10:34:00.218176369 +0000 UTC m=+1065.331790610" lastFinishedPulling="2026-02-24 10:34:02.69711346 +0000 UTC m=+1067.810727731" observedRunningTime="2026-02-24 10:34:03.293048187 +0000 UTC m=+1068.406662438" watchObservedRunningTime="2026-02-24 10:34:03.295923326 +0000 UTC m=+1068.409537587" Feb 24 10:34:03 crc kubenswrapper[4698]: I0224 10:34:03.628101 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e026ae3f-b95b-4b97-ac97-04c34a90bcca" path="/var/lib/kubelet/pods/e026ae3f-b95b-4b97-ac97-04c34a90bcca/volumes" Feb 24 10:34:08 crc kubenswrapper[4698]: I0224 10:34:08.832155 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:34:08 crc kubenswrapper[4698]: I0224 10:34:08.832877 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:34:09 crc kubenswrapper[4698]: I0224 10:34:09.883498 4698 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-zlh2h" podUID="e367f7aa-c587-4df4-9a1c-0f3448a72faf" containerName="registry-server" probeResult="failure" output=< Feb 24 10:34:09 crc kubenswrapper[4698]: timeout: failed to connect 
service ":50051" within 1s Feb 24 10:34:09 crc kubenswrapper[4698]: > Feb 24 10:34:18 crc kubenswrapper[4698]: I0224 10:34:18.880661 4698 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:34:18 crc kubenswrapper[4698]: I0224 10:34:18.950230 4698 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:34:19 crc kubenswrapper[4698]: I0224 10:34:19.117228 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zlh2h"] Feb 24 10:34:20 crc kubenswrapper[4698]: I0224 10:34:20.391013 4698 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-zlh2h" podUID="e367f7aa-c587-4df4-9a1c-0f3448a72faf" containerName="registry-server" containerID="cri-o://cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e" gracePeriod=2 Feb 24 10:34:20 crc kubenswrapper[4698]: E0224 10:34:20.670088 4698 certificate_manager.go:579] "Unhandled Error" err="kubernetes.io/kubelet-serving: certificate request was not signed: timed out waiting for the condition" logger="UnhandledError" Feb 24 10:34:20 crc kubenswrapper[4698]: I0224 10:34:20.774379 4698 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:34:20 crc kubenswrapper[4698]: I0224 10:34:20.836841 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-utilities\") pod \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " Feb 24 10:34:20 crc kubenswrapper[4698]: I0224 10:34:20.836901 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xbg8\" (UniqueName: \"kubernetes.io/projected/e367f7aa-c587-4df4-9a1c-0f3448a72faf-kube-api-access-8xbg8\") pod \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " Feb 24 10:34:20 crc kubenswrapper[4698]: I0224 10:34:20.836945 4698 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-catalog-content\") pod \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\" (UID: \"e367f7aa-c587-4df4-9a1c-0f3448a72faf\") " Feb 24 10:34:20 crc kubenswrapper[4698]: I0224 10:34:20.838370 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-utilities" (OuterVolumeSpecName: "utilities") pod "e367f7aa-c587-4df4-9a1c-0f3448a72faf" (UID: "e367f7aa-c587-4df4-9a1c-0f3448a72faf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:34:20 crc kubenswrapper[4698]: I0224 10:34:20.843519 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e367f7aa-c587-4df4-9a1c-0f3448a72faf-kube-api-access-8xbg8" (OuterVolumeSpecName: "kube-api-access-8xbg8") pod "e367f7aa-c587-4df4-9a1c-0f3448a72faf" (UID: "e367f7aa-c587-4df4-9a1c-0f3448a72faf"). InnerVolumeSpecName "kube-api-access-8xbg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 24 10:34:20 crc kubenswrapper[4698]: I0224 10:34:20.937800 4698 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xbg8\" (UniqueName: \"kubernetes.io/projected/e367f7aa-c587-4df4-9a1c-0f3448a72faf-kube-api-access-8xbg8\") on node \"crc\" DevicePath \"\"" Feb 24 10:34:20 crc kubenswrapper[4698]: I0224 10:34:20.937844 4698 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-utilities\") on node \"crc\" DevicePath \"\"" Feb 24 10:34:20 crc kubenswrapper[4698]: I0224 10:34:20.956937 4698 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e367f7aa-c587-4df4-9a1c-0f3448a72faf" (UID: "e367f7aa-c587-4df4-9a1c-0f3448a72faf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.038700 4698 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e367f7aa-c587-4df4-9a1c-0f3448a72faf-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.401184 4698 generic.go:334] "Generic (PLEG): container finished" podID="e367f7aa-c587-4df4-9a1c-0f3448a72faf" containerID="cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e" exitCode=0 Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.401232 4698 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zlh2h" event={"ID":"e367f7aa-c587-4df4-9a1c-0f3448a72faf","Type":"ContainerDied","Data":"cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e"} Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.401281 4698 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-zlh2h" event={"ID":"e367f7aa-c587-4df4-9a1c-0f3448a72faf","Type":"ContainerDied","Data":"140bbc5903b1eae40f8275b686a47251319c35616fff2fc434eea21cd9420f5b"} Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.401303 4698 scope.go:117] "RemoveContainer" containerID="cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.401445 4698 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zlh2h" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.439332 4698 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-zlh2h"] Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.439416 4698 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-zlh2h"] Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.439430 4698 scope.go:117] "RemoveContainer" containerID="7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.481536 4698 scope.go:117] "RemoveContainer" containerID="f9408b3bf28909a9fcd0402a2186ec06ee660896cb7246256fd49046bc621c33" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.497322 4698 scope.go:117] "RemoveContainer" containerID="cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e" Feb 24 10:34:21 crc kubenswrapper[4698]: E0224 10:34:21.497873 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e\": container with ID starting with cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e not found: ID does not exist" containerID="cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.497931 4698 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e"} err="failed to get container status \"cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e\": rpc error: code = NotFound desc = could not find container \"cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e\": container with ID starting with cc2e43ab77cc2bafde68c0e44f9b52fea2be07aa67ea6df68533497af955ee5e not found: ID does not exist" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.497960 4698 scope.go:117] "RemoveContainer" containerID="7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f" Feb 24 10:34:21 crc kubenswrapper[4698]: E0224 10:34:21.498332 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f\": container with ID starting with 7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f not found: ID does not exist" containerID="7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.498378 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f"} err="failed to get container status \"7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f\": rpc error: code = NotFound desc = could not find container \"7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f\": container with ID starting with 7157a1aad4e1a35bee00a755e164ad21fd1c52108f2f6a5eedc37b54ee240e6f not found: ID does not exist" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.498399 4698 scope.go:117] "RemoveContainer" containerID="f9408b3bf28909a9fcd0402a2186ec06ee660896cb7246256fd49046bc621c33" Feb 24 10:34:21 crc kubenswrapper[4698]: E0224 
10:34:21.498739 4698 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9408b3bf28909a9fcd0402a2186ec06ee660896cb7246256fd49046bc621c33\": container with ID starting with f9408b3bf28909a9fcd0402a2186ec06ee660896cb7246256fd49046bc621c33 not found: ID does not exist" containerID="f9408b3bf28909a9fcd0402a2186ec06ee660896cb7246256fd49046bc621c33" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.498800 4698 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9408b3bf28909a9fcd0402a2186ec06ee660896cb7246256fd49046bc621c33"} err="failed to get container status \"f9408b3bf28909a9fcd0402a2186ec06ee660896cb7246256fd49046bc621c33\": rpc error: code = NotFound desc = could not find container \"f9408b3bf28909a9fcd0402a2186ec06ee660896cb7246256fd49046bc621c33\": container with ID starting with f9408b3bf28909a9fcd0402a2186ec06ee660896cb7246256fd49046bc621c33 not found: ID does not exist" Feb 24 10:34:21 crc kubenswrapper[4698]: I0224 10:34:21.626647 4698 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e367f7aa-c587-4df4-9a1c-0f3448a72faf" path="/var/lib/kubelet/pods/e367f7aa-c587-4df4-9a1c-0f3448a72faf/volumes" Feb 24 10:34:22 crc kubenswrapper[4698]: I0224 10:34:22.747488 4698 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 24 10:34:22 crc kubenswrapper[4698]: I0224 10:34:22.757199 4698 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 24 10:34:22 crc kubenswrapper[4698]: I0224 10:34:22.776927 4698 ???:1] "http: TLS handshake error from 192.168.126.11:57892: no serving certificate available for the kubelet" Feb 24 10:34:22 crc kubenswrapper[4698]: I0224 10:34:22.803915 4698 ???:1] "http: TLS handshake error from 192.168.126.11:57908: no serving certificate available for the kubelet" Feb 24 10:34:22 crc 
kubenswrapper[4698]: I0224 10:34:22.837497 4698 ???:1] "http: TLS handshake error from 192.168.126.11:57920: no serving certificate available for the kubelet" Feb 24 10:34:22 crc kubenswrapper[4698]: I0224 10:34:22.877247 4698 ???:1] "http: TLS handshake error from 192.168.126.11:57932: no serving certificate available for the kubelet" Feb 24 10:34:22 crc kubenswrapper[4698]: I0224 10:34:22.938962 4698 ???:1] "http: TLS handshake error from 192.168.126.11:57946: no serving certificate available for the kubelet" Feb 24 10:34:23 crc kubenswrapper[4698]: I0224 10:34:23.040202 4698 ???:1] "http: TLS handshake error from 192.168.126.11:57962: no serving certificate available for the kubelet" Feb 24 10:34:23 crc kubenswrapper[4698]: I0224 10:34:23.219213 4698 ???:1] "http: TLS handshake error from 192.168.126.11:57964: no serving certificate available for the kubelet" Feb 24 10:34:23 crc kubenswrapper[4698]: I0224 10:34:23.560842 4698 ???:1] "http: TLS handshake error from 192.168.126.11:57976: no serving certificate available for the kubelet" Feb 24 10:34:24 crc kubenswrapper[4698]: I0224 10:34:24.221075 4698 ???:1] "http: TLS handshake error from 192.168.126.11:57988: no serving certificate available for the kubelet" Feb 24 10:34:25 crc kubenswrapper[4698]: I0224 10:34:25.532033 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58000: no serving certificate available for the kubelet" Feb 24 10:34:28 crc kubenswrapper[4698]: I0224 10:34:28.116155 4698 ???:1] "http: TLS handshake error from 192.168.126.11:58008: no serving certificate available for the kubelet" Feb 24 10:34:33 crc kubenswrapper[4698]: I0224 10:34:33.271009 4698 ???:1] "http: TLS handshake error from 192.168.126.11:47990: no serving certificate available for the kubelet" Feb 24 10:34:43 crc kubenswrapper[4698]: I0224 10:34:43.542693 4698 ???:1] "http: TLS handshake error from 192.168.126.11:60270: no serving certificate available for the kubelet" Feb 24 10:35:04 crc kubenswrapper[4698]: I0224 
10:35:04.056958 4698 ???:1] "http: TLS handshake error from 192.168.126.11:43120: no serving certificate available for the kubelet" Feb 24 10:35:22 crc kubenswrapper[4698]: I0224 10:35:22.196862 4698 patch_prober.go:28] interesting pod/machine-config-daemon-nn578 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 24 10:35:22 crc kubenswrapper[4698]: I0224 10:35:22.197493 4698 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-nn578" podUID="b4ee0bb1-125d-4852-a54d-7dadf6177545" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"